Anna KaRNNa

In this notebook, I'll build a character-wise RNN trained on Anna Karenina, one of my all-time favorite books. It'll be able to generate new text based on the text from the book.

This network is based on Andrej Karpathy's post on RNNs and his implementation in Torch. I also drew on information from r2rt and from Sherjil Ozair's implementation on GitHub. Below is the general architecture of the character-wise RNN.


In [1]:
import time
from collections import namedtuple

import numpy as np
import tensorflow as tf

First we'll load the text file and convert it into integers for our network to use.


In [2]:
with open('anna.txt', 'r') as f:
    text = f.read()

# Build the character vocabulary and lookup tables in both directions
vocab = set(text)
vocab_to_int = {c: i for i, c in enumerate(vocab)}
int_to_vocab = dict(enumerate(vocab))

# Encode the entire text as an array of integers
chars = np.array([vocab_to_int[c] for c in text], dtype=np.int32)
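
As a quick sanity check (just an illustrative snippet, not one of the numbered cells), the encoding can be inverted with int_to_vocab; decoding the first few integers should give back the start of the text:

''.join(int_to_vocab[i] for i in chars[:20])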

In [3]:
text[:100]


Out[3]:
'Chapter 1\n\n\nHappy families are all alike; every unhappy family is unhappy in its own\nway.\n\nEverythin'

In [4]:
chars[:100]


Out[4]:
array([49, 25, 63, 11, 54, 29, 37, 69,  1, 67, 67, 67, 38, 63, 11, 11, 64,
       69, 12, 63,  2, 20, 75, 20, 29, 17, 69, 63, 37, 29, 69, 63, 75, 75,
       69, 63, 75, 20, 19, 29, 52, 69, 29, 45, 29, 37, 64, 69, 31, 18, 25,
       63, 11, 11, 64, 69, 12, 63,  2, 20, 75, 64, 69, 20, 17, 69, 31, 18,
       25, 63, 11, 11, 64, 69, 20, 18, 69, 20, 54, 17, 69, 40, 13, 18, 67,
       13, 63, 64, 26, 67, 67, 81, 45, 29, 37, 64, 54, 25, 20, 18], dtype=int32)

Now I need to split up the data into batches, and into training and validation sets. I should be making a test set here, but I'm not going to worry about that; my test will be whether the network can generate new text.

Here I'll make both input and target arrays. The targets are the same as the inputs, except shifted one character over. I'll also drop the last bit of data so that I'll only have completely full batches.

The idea here is to make a 2D matrix where the number of rows is equal to the batch size (the number of sequences run through the network in parallel). Each row will be one long stretch of consecutive characters from the text. We'll then split this data into a training set and validation set using the split_frac keyword argument: the first 90% of each row goes to the training set, the remaining 10% to the validation set.


In [5]:
def split_data(chars, batch_size, num_steps, split_frac=0.9):
    """ 
    Split character data into training and validation sets, inputs and targets for each set.
    
    Arguments
    ---------
    chars: character array
    batch_size: Number of sequences per batch (rows in the batch arrays)
    num_steps: Number of sequence steps to keep in the input and pass to the network
    split_frac: Fraction of batches to keep in the training set
    
    
    Returns train_x, train_y, val_x, val_y
    """
    
    slice_size = batch_size * num_steps
    n_batches = int(len(chars) / slice_size)
    
    # Drop the last few characters to make only full batches
    x = chars[: n_batches*slice_size]
    y = chars[1: n_batches*slice_size + 1]
    
    # Split the data into batch_size slices, then stack them into a 2D matrix 
    x = np.stack(np.split(x, batch_size))
    y = np.stack(np.split(y, batch_size))
    
    # Now x and y are arrays with dimensions batch_size x n_batches*num_steps
    
    # Split into training and validation sets, keep the first split_frac batches for training
    split_idx = int(n_batches*split_frac)
    train_x, train_y = x[:, :split_idx*num_steps], y[:, :split_idx*num_steps]
    val_x, val_y = x[:, split_idx*num_steps:], y[:, split_idx*num_steps:]
    
    return train_x, train_y, val_x, val_y

In [6]:
train_x, train_y, val_x, val_y = split_data(chars, 10, 200)

In [7]:
train_x.shape


Out[7]:
(10, 178400)

In [8]:
train_x[:,:10]


Out[8]:
array([[49, 25, 63, 11, 54, 29, 37, 69,  1, 67],
       [72, 18,  4, 69, 25, 29, 69,  2, 40, 45],
       [69, 56, 63, 54, 56, 25, 20, 18, 76, 69],
       [40, 54, 25, 29, 37, 69, 13, 40, 31, 75],
       [69, 54, 25, 29, 69, 75, 63, 18,  4, 16],
       [69, 30, 25, 37, 40, 31, 76, 25, 69, 75],
       [54, 69, 54, 40, 67,  4, 40, 26, 67, 67],
       [40, 69, 25, 29, 37, 17, 29, 75, 12, 57],
       [25, 63, 54, 69, 20, 17, 69, 54, 25, 29],
       [29, 37, 17, 29, 75, 12, 69, 63, 18,  4]], dtype=int32)

I'll write another function to grab batches out of the arrays made by split_data. Here each batch will be a sliding window on these arrays with size batch_size x num_steps. For example, if we want our network to train on sequences of 100 characters, num_steps = 100. For the next batch, we shift this window over by num_steps characters. In this way we can feed batches to the network and the cell states will carry over from one batch to the next.


In [9]:
def get_batch(arrs, num_steps):
    batch_size, slice_size = arrs[0].shape
    
    n_batches = int(slice_size/num_steps)
    # Slide a window of num_steps columns across each array, one batch at a time
    for b in range(n_batches):
        yield [x[:, b*num_steps: (b+1)*num_steps] for x in arrs]
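
As a quick illustration (using the train_x and train_y arrays from the split_data(chars, 10, 200) call above), the first batch has the expected shape, and the targets are the inputs shifted one character to the left:

x, y = next(get_batch([train_x, train_y], num_steps=200))
print(x.shape, y.shape)               # (10, 200) (10, 200)
print((x[0, 1:] == y[0, :-1]).all())  # True: y is x shifted left by one character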

In [10]:
def build_rnn(num_classes, batch_size=50, num_steps=50, lstm_size=128, num_layers=2,
              learning_rate=0.001, grad_clip=5, sampling=False):
        
    if sampling:
        batch_size, num_steps = 1, 1

    tf.reset_default_graph()
    
    # Declare placeholders we'll feed into the graph
    with tf.name_scope('inputs'):
        inputs = tf.placeholder(tf.int32, [batch_size, num_steps], name='inputs')
        x_one_hot = tf.one_hot(inputs, num_classes, name='x_one_hot')
    
    with tf.name_scope('targets'):
        targets = tf.placeholder(tf.int32, [batch_size, num_steps], name='targets')
        y_one_hot = tf.one_hot(targets, num_classes, name='y_one_hot')
        y_reshaped = tf.reshape(y_one_hot, [-1, num_classes])
    
    keep_prob = tf.placeholder(tf.float32, name='keep_prob')
    
    # Build the RNN layers
    with tf.name_scope("RNN_cells"):
        # Build a separate LSTM + dropout cell for each layer; reusing a single
        # cell object with [drop] * num_layers can end up sharing weights across
        # layers in newer TF 1.x releases.
        cell = tf.contrib.rnn.MultiRNNCell(
            [tf.contrib.rnn.DropoutWrapper(tf.contrib.rnn.BasicLSTMCell(lstm_size),
                                           output_keep_prob=keep_prob)
             for _ in range(num_layers)])
    
    with tf.name_scope("RNN_init_state"):
        initial_state = cell.zero_state(batch_size, tf.float32)

    # Run the data through the RNN layers
    with tf.name_scope("RNN_forward"):
        outputs, state = tf.nn.dynamic_rnn(cell, x_one_hot, initial_state=initial_state)
    
    final_state = state
    
    # Reshape output so it's a bunch of rows, one row for each cell output
    with tf.name_scope('sequence_reshape'):
        seq_output = tf.concat(outputs, axis=1,name='seq_output')
        output = tf.reshape(seq_output, [-1, lstm_size], name='graph_output')
    
    # Now connect the RNN outputs to a softmax layer and calculate the cost
    with tf.name_scope('logits'):
        softmax_w = tf.Variable(tf.truncated_normal((lstm_size, num_classes), stddev=0.1),
                               name='softmax_w')
        softmax_b = tf.Variable(tf.zeros(num_classes), name='softmax_b')
        logits = tf.matmul(output, softmax_w) + softmax_b
        tf.summary.histogram('softmax_w', softmax_w)
        tf.summary.histogram('softmax_b', softmax_b)

    with tf.name_scope('predictions'):
        preds = tf.nn.softmax(logits, name='predictions')
        tf.summary.histogram('predictions', preds)
    
    with tf.name_scope('cost'):
        loss = tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=y_reshaped, name='loss')
        cost = tf.reduce_mean(loss, name='cost')
        tf.summary.scalar('cost', cost)

    # Optimizer for training, using gradient clipping to control exploding gradients
    with tf.name_scope('train'):
        tvars = tf.trainable_variables()
        grads, _ = tf.clip_by_global_norm(tf.gradients(cost, tvars), grad_clip)
        train_op = tf.train.AdamOptimizer(learning_rate)
        optimizer = train_op.apply_gradients(zip(grads, tvars))
    
    merged = tf.summary.merge_all()
    
    # Export the nodes 
    export_nodes = ['inputs', 'targets', 'initial_state', 'final_state',
                    'keep_prob', 'cost', 'preds', 'optimizer', 'merged']
    Graph = namedtuple('Graph', export_nodes)
    local_dict = locals()
    graph = Graph(*[local_dict[each] for each in export_nodes])
    
    return graph

Hyperparameters

Here I'm defining the hyperparameters for the network. The two you probably haven't seen before are lstm_size and num_layers. These set the number of hidden units in the LSTM layers and the number of LSTM layers, respectively. Making these bigger will typically improve the network's performance, but you'll have to watch out for overfitting: if your validation loss is much larger than the training loss, you're probably overfitting. In that case, decrease the size of the network or decrease the dropout keep probability.


In [11]:
batch_size = 100
num_steps = 100
lstm_size = 512
num_layers = 2
learning_rate = 0.001

Training

Time for training, which is pretty straightforward. Here I pass in some data and get an LSTM state back. Then I pass that state back in to the network so the next batch can continue the state from the previous batch. Every so often (set by save_every_n) I'd also calculate the validation loss and save a checkpoint; the train function below only tracks the training loss and logs TensorBoard summaries, so that step is sketched separately after it.


In [12]:
!mkdir -p checkpoints/anna

In [13]:
def train(model, epochs, file_writer):
    
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())

        # Use the line below to load a checkpoint and resume training
        #saver.restore(sess, 'checkpoints/anna20.ckpt')

        n_batches = int(train_x.shape[1]/num_steps)
        iterations = n_batches * epochs
        for e in range(epochs):

            # Train network
            new_state = sess.run(model.initial_state)
            loss = 0
            for b, (x, y) in enumerate(get_batch([train_x, train_y], num_steps), 1):
                iteration = e*n_batches + b
                start = time.time()
                feed = {model.inputs: x,
                        model.targets: y,
                        model.keep_prob: 0.5,
                        model.initial_state: new_state}
                summary, batch_loss, new_state, _ = sess.run([model.merged, model.cost, 
                                                              model.final_state, model.optimizer], 
                                                              feed_dict=feed)
                loss += batch_loss
                end = time.time()
                print('Epoch {}/{} '.format(e+1, epochs),
                      'Iteration {}/{}'.format(iteration, iterations),
                      'Training loss: {:.4f}'.format(loss/b),
                      '{:.4f} sec/batch'.format((end-start)))

                file_writer.add_summary(summary, iteration)
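
The validation-loss and checkpoint step described above isn't part of train as written. Below is a minimal sketch of what it could look like if added inside the batch loop; it assumes a saver = tf.train.Saver() created after building the model, and save_every_n is an assumed interval rather than something defined in the original code.

# Sketch only: periodic validation and checkpointing inside the batch loop of train().
# Assumes `saver = tf.train.Saver()` exists; `save_every_n` is an assumed interval.
save_every_n = 200

if iteration % save_every_n == 0:
    # Run the validation data through the network with dropout turned off
    val_state = sess.run(model.initial_state)
    val_losses = []
    for x, y in get_batch([val_x, val_y], num_steps):
        feed = {model.inputs: x,
                model.targets: y,
                model.keep_prob: 1.0,
                model.initial_state: val_state}
        val_loss, val_state = sess.run([model.cost, model.final_state], feed_dict=feed)
        val_losses.append(val_loss)
    print('Validation loss: {:.4f}'.format(np.mean(val_losses)))
    saver.save(sess, 'checkpoints/anna/i{}_l{}.ckpt'.format(iteration, lstm_size))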

In [14]:
epochs = 20
batch_size = 100
num_steps = 100
train_x, train_y, val_x, val_y = split_data(chars, batch_size, num_steps)

for lstm_size in [128,256,512]:
    for num_layers in [1, 2]:
        for learning_rate in [0.002, 0.001]:
            log_string = 'logs/4/lr={},rl={},ru={}'.format(learning_rate, num_layers, lstm_size)
            writer = tf.summary.FileWriter(log_string)
            model = build_rnn(len(vocab), 
                    batch_size=batch_size,
                    num_steps=num_steps,
                    learning_rate=learning_rate,
                    lstm_size=lstm_size,
                    num_layers=num_layers)
            
            train(model, epochs, writer)


Epoch 1/20  Iteration 1/3560 Training loss: 4.4258 0.4246 sec/batch
Epoch 1/20  Iteration 2/3560 Training loss: 4.4138 0.4081 sec/batch
Epoch 1/20  Iteration 3/3560 Training loss: 4.4010 0.3948 sec/batch
Epoch 1/20  Iteration 4/3560 Training loss: 4.3830 0.3979 sec/batch
Epoch 1/20  Iteration 5/3560 Training loss: 4.3478 0.4104 sec/batch
Epoch 1/20  Iteration 6/3560 Training loss: 4.2738 0.3831 sec/batch
Epoch 1/20  Iteration 7/3560 Training loss: 4.1904 0.4090 sec/batch
Epoch 1/20  Iteration 8/3560 Training loss: 4.1138 0.3932 sec/batch
Epoch 1/20  Iteration 9/3560 Training loss: 4.0456 0.3781 sec/batch
Epoch 1/20  Iteration 10/3560 Training loss: 3.9857 0.3813 sec/batch
Epoch 1/20  Iteration 11/3560 Training loss: 3.9304 0.3823 sec/batch
Epoch 1/20  Iteration 12/3560 Training loss: 3.8820 0.3870 sec/batch
Epoch 1/20  Iteration 13/3560 Training loss: 3.8391 0.3860 sec/batch
Epoch 1/20  Iteration 14/3560 Training loss: 3.8031 0.3952 sec/batch
Epoch 1/20  Iteration 15/3560 Training loss: 3.7705 0.4018 sec/batch
Epoch 1/20  Iteration 16/3560 Training loss: 3.7407 0.4093 sec/batch
Epoch 1/20  Iteration 17/3560 Training loss: 3.7132 0.4073 sec/batch
Epoch 1/20  Iteration 18/3560 Training loss: 3.6896 0.3883 sec/batch
Epoch 1/20  Iteration 19/3560 Training loss: 3.6673 0.3822 sec/batch
Epoch 1/20  Iteration 20/3560 Training loss: 3.6450 0.4080 sec/batch
Epoch 1/20  Iteration 21/3560 Training loss: 3.6257 0.4029 sec/batch
Epoch 1/20  Iteration 22/3560 Training loss: 3.6076 0.4160 sec/batch
Epoch 1/20  Iteration 23/3560 Training loss: 3.5904 0.3931 sec/batch
Epoch 1/20  Iteration 24/3560 Training loss: 3.5746 0.3886 sec/batch
Epoch 1/20  Iteration 25/3560 Training loss: 3.5593 0.3941 sec/batch
Epoch 1/20  Iteration 26/3560 Training loss: 3.5457 0.3963 sec/batch
Epoch 1/20  Iteration 27/3560 Training loss: 3.5328 0.3804 sec/batch
Epoch 1/20  Iteration 28/3560 Training loss: 3.5201 0.3836 sec/batch
Epoch 1/20  Iteration 29/3560 Training loss: 3.5082 0.3800 sec/batch
Epoch 1/20  Iteration 30/3560 Training loss: 3.4976 0.3957 sec/batch
Epoch 1/20  Iteration 31/3560 Training loss: 3.4881 0.4026 sec/batch
Epoch 1/20  Iteration 32/3560 Training loss: 3.4781 0.3927 sec/batch
Epoch 1/20  Iteration 33/3560 Training loss: 3.4683 0.3819 sec/batch
Epoch 1/20  Iteration 34/3560 Training loss: 3.4598 0.3798 sec/batch
Epoch 1/20  Iteration 35/3560 Training loss: 3.4510 0.4109 sec/batch
Epoch 1/20  Iteration 36/3560 Training loss: 3.4432 0.4033 sec/batch
Epoch 1/20  Iteration 37/3560 Training loss: 3.4348 0.3983 sec/batch
Epoch 1/20  Iteration 38/3560 Training loss: 3.4271 0.4021 sec/batch
Epoch 1/20  Iteration 39/3560 Training loss: 3.4195 0.3899 sec/batch
Epoch 1/20  Iteration 40/3560 Training loss: 3.4123 0.3965 sec/batch
Epoch 1/20  Iteration 41/3560 Training loss: 3.4054 0.3972 sec/batch
Epoch 1/20  Iteration 42/3560 Training loss: 3.3989 0.4069 sec/batch
Epoch 1/20  Iteration 43/3560 Training loss: 3.3924 0.3899 sec/batch
Epoch 1/20  Iteration 44/3560 Training loss: 3.3861 0.3880 sec/batch
Epoch 1/20  Iteration 45/3560 Training loss: 3.3801 0.3811 sec/batch
Epoch 1/20  Iteration 46/3560 Training loss: 3.3746 0.4016 sec/batch
Epoch 1/20  Iteration 47/3560 Training loss: 3.3694 0.3784 sec/batch
Epoch 1/20  Iteration 48/3560 Training loss: 3.3644 0.3804 sec/batch
Epoch 1/20  Iteration 49/3560 Training loss: 3.3596 0.3790 sec/batch
Epoch 1/20  Iteration 50/3560 Training loss: 3.3548 0.4070 sec/batch
Epoch 1/20  Iteration 51/3560 Training loss: 3.3501 0.4004 sec/batch
Epoch 1/20  Iteration 52/3560 Training loss: 3.3454 0.3861 sec/batch
Epoch 1/20  Iteration 53/3560 Training loss: 3.3411 0.3953 sec/batch
Epoch 1/20  Iteration 54/3560 Training loss: 3.3366 0.3959 sec/batch
Epoch 1/20  Iteration 55/3560 Training loss: 3.3324 0.3845 sec/batch
Epoch 1/20  Iteration 56/3560 Training loss: 3.3281 0.3834 sec/batch
Epoch 1/20  Iteration 57/3560 Training loss: 3.3240 0.3831 sec/batch
Epoch 1/20  Iteration 58/3560 Training loss: 3.3201 0.3898 sec/batch
Epoch 1/20  Iteration 59/3560 Training loss: 3.3161 0.3974 sec/batch
Epoch 1/20  Iteration 60/3560 Training loss: 3.3125 0.3915 sec/batch
Epoch 1/20  Iteration 61/3560 Training loss: 3.3090 0.3823 sec/batch
Epoch 1/20  Iteration 62/3560 Training loss: 3.3058 0.3833 sec/batch
Epoch 1/20  Iteration 63/3560 Training loss: 3.3027 0.4099 sec/batch
Epoch 1/20  Iteration 64/3560 Training loss: 3.2991 0.3857 sec/batch
Epoch 1/20  Iteration 65/3560 Training loss: 3.2956 0.3859 sec/batch
Epoch 1/20  Iteration 66/3560 Training loss: 3.2926 0.3802 sec/batch
Epoch 1/20  Iteration 67/3560 Training loss: 3.2896 0.3841 sec/batch
Epoch 1/20  Iteration 68/3560 Training loss: 3.2859 0.3845 sec/batch
Epoch 1/20  Iteration 69/3560 Training loss: 3.2826 0.3811 sec/batch
Epoch 1/20  Iteration 70/3560 Training loss: 3.2797 0.3826 sec/batch
Epoch 1/20  Iteration 71/3560 Training loss: 3.2767 0.3828 sec/batch
Epoch 1/20  Iteration 72/3560 Training loss: 3.2741 0.3785 sec/batch
Epoch 1/20  Iteration 73/3560 Training loss: 3.2712 0.3810 sec/batch
Epoch 1/20  Iteration 74/3560 Training loss: 3.2685 0.3832 sec/batch
Epoch 1/20  Iteration 75/3560 Training loss: 3.2659 0.3881 sec/batch
Epoch 1/20  Iteration 76/3560 Training loss: 3.2634 0.4097 sec/batch
Epoch 1/20  Iteration 77/3560 Training loss: 3.2608 0.3896 sec/batch
Epoch 1/20  Iteration 78/3560 Training loss: 3.2582 0.3958 sec/batch
Epoch 1/20  Iteration 79/3560 Training loss: 3.2555 0.3860 sec/batch
Epoch 1/20  Iteration 80/3560 Training loss: 3.2527 0.3838 sec/batch
Epoch 1/20  Iteration 81/3560 Training loss: 3.2501 0.3814 sec/batch
Epoch 1/20  Iteration 82/3560 Training loss: 3.2477 0.3785 sec/batch
Epoch 1/20  Iteration 83/3560 Training loss: 3.2455 0.3808 sec/batch
Epoch 1/20  Iteration 84/3560 Training loss: 3.2430 0.3783 sec/batch
Epoch 1/20  Iteration 85/3560 Training loss: 3.2403 0.3796 sec/batch
Epoch 1/20  Iteration 86/3560 Training loss: 3.2377 0.3776 sec/batch
Epoch 1/20  Iteration 87/3560 Training loss: 3.2352 0.3808 sec/batch
Epoch 1/20  Iteration 88/3560 Training loss: 3.2327 0.3809 sec/batch
Epoch 1/20  Iteration 89/3560 Training loss: 3.2304 0.4016 sec/batch
Epoch 1/20  Iteration 90/3560 Training loss: 3.2281 0.3872 sec/batch
Epoch 1/20  Iteration 91/3560 Training loss: 3.2258 0.3846 sec/batch
Epoch 1/20  Iteration 92/3560 Training loss: 3.2234 0.3765 sec/batch
Epoch 1/20  Iteration 93/3560 Training loss: 3.2211 0.3808 sec/batch
Epoch 1/20  Iteration 94/3560 Training loss: 3.2187 0.3903 sec/batch
Epoch 1/20  Iteration 95/3560 Training loss: 3.2164 0.3868 sec/batch
Epoch 1/20  Iteration 96/3560 Training loss: 3.2140 0.3913 sec/batch
Epoch 1/20  Iteration 97/3560 Training loss: 3.2117 0.3781 sec/batch
Epoch 1/20  Iteration 98/3560 Training loss: 3.2093 0.3777 sec/batch
Epoch 1/20  Iteration 99/3560 Training loss: 3.2071 0.3848 sec/batch
Epoch 1/20  Iteration 100/3560 Training loss: 3.2048 0.3788 sec/batch
Epoch 1/20  Iteration 101/3560 Training loss: 3.2025 0.3822 sec/batch
Epoch 1/20  Iteration 102/3560 Training loss: 3.2002 0.3852 sec/batch
Epoch 1/20  Iteration 103/3560 Training loss: 3.1979 0.3903 sec/batch
Epoch 1/20  Iteration 104/3560 Training loss: 3.1955 0.3845 sec/batch
Epoch 1/20  Iteration 105/3560 Training loss: 3.1932 0.3852 sec/batch
Epoch 1/20  Iteration 106/3560 Training loss: 3.1908 0.3885 sec/batch
Epoch 1/20  Iteration 107/3560 Training loss: 3.1883 0.3810 sec/batch
Epoch 1/20  Iteration 108/3560 Training loss: 3.1859 0.3800 sec/batch
Epoch 1/20  Iteration 109/3560 Training loss: 3.1837 0.3889 sec/batch
Epoch 1/20  Iteration 110/3560 Training loss: 3.1810 0.3839 sec/batch
Epoch 1/20  Iteration 111/3560 Training loss: 3.1786 0.3891 sec/batch
Epoch 1/20  Iteration 112/3560 Training loss: 3.1761 0.3847 sec/batch
Epoch 1/20  Iteration 113/3560 Training loss: 3.1736 0.3781 sec/batch
Epoch 1/20  Iteration 114/3560 Training loss: 3.1710 0.3876 sec/batch
Epoch 1/20  Iteration 115/3560 Training loss: 3.1685 0.3869 sec/batch
Epoch 1/20  Iteration 116/3560 Training loss: 3.1659 0.3951 sec/batch
Epoch 1/20  Iteration 117/3560 Training loss: 3.1634 0.3809 sec/batch
Epoch 1/20  Iteration 118/3560 Training loss: 3.1610 0.3807 sec/batch
Epoch 1/20  Iteration 119/3560 Training loss: 3.1587 0.3835 sec/batch
Epoch 1/20  Iteration 120/3560 Training loss: 3.1561 0.3811 sec/batch
Epoch 1/20  Iteration 121/3560 Training loss: 3.1539 0.3818 sec/batch
Epoch 1/20  Iteration 122/3560 Training loss: 3.1515 0.3857 sec/batch
Epoch 1/20  Iteration 123/3560 Training loss: 3.1490 0.3787 sec/batch
Epoch 1/20  Iteration 124/3560 Training loss: 3.1466 0.3788 sec/batch
Epoch 1/20  Iteration 125/3560 Training loss: 3.1440 0.3925 sec/batch
Epoch 1/20  Iteration 126/3560 Training loss: 3.1414 0.3859 sec/batch
Epoch 1/20  Iteration 127/3560 Training loss: 3.1390 0.3802 sec/batch
Epoch 1/20  Iteration 128/3560 Training loss: 3.1366 0.3768 sec/batch
Epoch 1/20  Iteration 129/3560 Training loss: 3.1341 0.3992 sec/batch
Epoch 1/20  Iteration 130/3560 Training loss: 3.1316 0.3801 sec/batch
Epoch 1/20  Iteration 131/3560 Training loss: 3.1292 0.3891 sec/batch
Epoch 1/20  Iteration 132/3560 Training loss: 3.1266 0.3802 sec/batch
Epoch 1/20  Iteration 133/3560 Training loss: 3.1241 0.3781 sec/batch
Epoch 1/20  Iteration 134/3560 Training loss: 3.1216 0.3950 sec/batch
Epoch 1/20  Iteration 135/3560 Training loss: 3.1188 0.3764 sec/batch
Epoch 1/20  Iteration 136/3560 Training loss: 3.1162 0.3838 sec/batch
Epoch 1/20  Iteration 137/3560 Training loss: 3.1136 0.3852 sec/batch
Epoch 1/20  Iteration 138/3560 Training loss: 3.1110 0.3792 sec/batch
Epoch 1/20  Iteration 139/3560 Training loss: 3.1086 0.3924 sec/batch
Epoch 1/20  Iteration 140/3560 Training loss: 3.1060 0.3856 sec/batch
Epoch 1/20  Iteration 141/3560 Training loss: 3.1035 0.3770 sec/batch
Epoch 1/20  Iteration 142/3560 Training loss: 3.1008 0.3994 sec/batch
Epoch 1/20  Iteration 143/3560 Training loss: 3.0983 0.3759 sec/batch
Epoch 1/20  Iteration 144/3560 Training loss: 3.0957 0.3778 sec/batch
Epoch 1/20  Iteration 145/3560 Training loss: 3.0932 0.3838 sec/batch
Epoch 1/20  Iteration 146/3560 Training loss: 3.0907 0.3847 sec/batch
Epoch 1/20  Iteration 147/3560 Training loss: 3.0883 0.3854 sec/batch
Epoch 1/20  Iteration 148/3560 Training loss: 3.0859 0.3805 sec/batch
Epoch 1/20  Iteration 149/3560 Training loss: 3.0833 0.3770 sec/batch
Epoch 1/20  Iteration 150/3560 Training loss: 3.0808 0.4007 sec/batch
Epoch 1/20  Iteration 151/3560 Training loss: 3.0785 0.3810 sec/batch
Epoch 1/20  Iteration 152/3560 Training loss: 3.0763 0.3848 sec/batch
Epoch 1/20  Iteration 153/3560 Training loss: 3.0739 0.3789 sec/batch
Epoch 1/20  Iteration 154/3560 Training loss: 3.0714 0.3886 sec/batch
Epoch 1/20  Iteration 155/3560 Training loss: 3.0689 0.3987 sec/batch
Epoch 1/20  Iteration 156/3560 Training loss: 3.0664 0.3826 sec/batch
Epoch 1/20  Iteration 157/3560 Training loss: 3.0639 0.3834 sec/batch
Epoch 1/20  Iteration 158/3560 Training loss: 3.0613 0.3896 sec/batch
Epoch 1/20  Iteration 159/3560 Training loss: 3.0588 0.3910 sec/batch
Epoch 1/20  Iteration 160/3560 Training loss: 3.0563 0.3850 sec/batch
Epoch 1/20  Iteration 161/3560 Training loss: 3.0538 0.3811 sec/batch
Epoch 1/20  Iteration 162/3560 Training loss: 3.0511 0.3844 sec/batch
Epoch 1/20  Iteration 163/3560 Training loss: 3.0486 0.3836 sec/batch
Epoch 1/20  Iteration 164/3560 Training loss: 3.0461 0.3785 sec/batch
Epoch 1/20  Iteration 165/3560 Training loss: 3.0437 0.3855 sec/batch
Epoch 1/20  Iteration 166/3560 Training loss: 3.0412 0.3807 sec/batch
Epoch 1/20  Iteration 167/3560 Training loss: 3.0388 0.3771 sec/batch
Epoch 1/20  Iteration 168/3560 Training loss: 3.0364 0.4112 sec/batch
Epoch 1/20  Iteration 169/3560 Training loss: 3.0341 0.3846 sec/batch
Epoch 1/20  Iteration 170/3560 Training loss: 3.0315 0.3751 sec/batch
Epoch 1/20  Iteration 171/3560 Training loss: 3.0292 0.3832 sec/batch
Epoch 1/20  Iteration 172/3560 Training loss: 3.0270 0.3825 sec/batch
Epoch 1/20  Iteration 173/3560 Training loss: 3.0248 0.3918 sec/batch
Epoch 1/20  Iteration 174/3560 Training loss: 3.0227 0.3754 sec/batch
Epoch 1/20  Iteration 175/3560 Training loss: 3.0205 0.3799 sec/batch
Epoch 1/20  Iteration 176/3560 Training loss: 3.0182 0.3811 sec/batch
Epoch 1/20  Iteration 177/3560 Training loss: 3.0158 0.4010 sec/batch
Epoch 1/20  Iteration 178/3560 Training loss: 3.0133 0.3853 sec/batch
Epoch 2/20  Iteration 179/3560 Training loss: 2.6354 0.3815 sec/batch
Epoch 2/20  Iteration 180/3560 Training loss: 2.5956 0.3930 sec/batch
Epoch 2/20  Iteration 181/3560 Training loss: 2.5875 0.3960 sec/batch
Epoch 2/20  Iteration 182/3560 Training loss: 2.5827 0.3798 sec/batch
Epoch 2/20  Iteration 183/3560 Training loss: 2.5811 0.3806 sec/batch
Epoch 2/20  Iteration 184/3560 Training loss: 2.5788 0.3832 sec/batch
Epoch 2/20  Iteration 185/3560 Training loss: 2.5787 0.3953 sec/batch
Epoch 2/20  Iteration 186/3560 Training loss: 2.5782 0.3824 sec/batch
Epoch 2/20  Iteration 187/3560 Training loss: 2.5779 0.3798 sec/batch
Epoch 2/20  Iteration 188/3560 Training loss: 2.5763 0.3897 sec/batch
Epoch 2/20  Iteration 189/3560 Training loss: 2.5740 0.3831 sec/batch
Epoch 2/20  Iteration 190/3560 Training loss: 2.5731 0.3854 sec/batch
Epoch 2/20  Iteration 191/3560 Training loss: 2.5719 0.3836 sec/batch
Epoch 2/20  Iteration 192/3560 Training loss: 2.5724 0.3770 sec/batch
Epoch 2/20  Iteration 193/3560 Training loss: 2.5712 0.3849 sec/batch
Epoch 2/20  Iteration 194/3560 Training loss: 2.5705 0.4081 sec/batch
Epoch 2/20  Iteration 195/3560 Training loss: 2.5695 0.3930 sec/batch
Epoch 2/20  Iteration 196/3560 Training loss: 2.5700 0.3838 sec/batch
Epoch 2/20  Iteration 197/3560 Training loss: 2.5692 0.3806 sec/batch
Epoch 2/20  Iteration 198/3560 Training loss: 2.5668 0.3935 sec/batch
Epoch 2/20  Iteration 199/3560 Training loss: 2.5653 0.3793 sec/batch
Epoch 2/20  Iteration 200/3560 Training loss: 2.5654 0.3769 sec/batch
Epoch 2/20  Iteration 201/3560 Training loss: 2.5638 0.3769 sec/batch
Epoch 2/20  Iteration 202/3560 Training loss: 2.5623 0.3793 sec/batch
Epoch 2/20  Iteration 203/3560 Training loss: 2.5607 0.3830 sec/batch
Epoch 2/20  Iteration 204/3560 Training loss: 2.5597 0.3865 sec/batch
Epoch 2/20  Iteration 205/3560 Training loss: 2.5585 0.3786 sec/batch
Epoch 2/20  Iteration 206/3560 Training loss: 2.5569 0.3814 sec/batch
Epoch 2/20  Iteration 207/3560 Training loss: 2.5563 0.4010 sec/batch
Epoch 2/20  Iteration 208/3560 Training loss: 2.5555 0.3930 sec/batch
Epoch 2/20  Iteration 209/3560 Training loss: 2.5552 0.3799 sec/batch
Epoch 2/20  Iteration 210/3560 Training loss: 2.5535 0.3792 sec/batch
Epoch 2/20  Iteration 211/3560 Training loss: 2.5519 0.3850 sec/batch
Epoch 2/20  Iteration 212/3560 Training loss: 2.5509 0.3787 sec/batch
Epoch 2/20  Iteration 213/3560 Training loss: 2.5495 0.3825 sec/batch
Epoch 2/20  Iteration 214/3560 Training loss: 2.5486 0.3949 sec/batch
Epoch 2/20  Iteration 215/3560 Training loss: 2.5472 0.3854 sec/batch
Epoch 2/20  Iteration 216/3560 Training loss: 2.5455 0.3826 sec/batch
Epoch 2/20  Iteration 217/3560 Training loss: 2.5439 0.3840 sec/batch
Epoch 2/20  Iteration 218/3560 Training loss: 2.5423 0.3812 sec/batch
Epoch 2/20  Iteration 219/3560 Training loss: 2.5408 0.3831 sec/batch
Epoch 2/20  Iteration 220/3560 Training loss: 2.5394 0.4011 sec/batch
Epoch 2/20  Iteration 221/3560 Training loss: 2.5377 0.3831 sec/batch
Epoch 2/20  Iteration 222/3560 Training loss: 2.5362 0.3852 sec/batch
Epoch 2/20  Iteration 223/3560 Training loss: 2.5348 0.3838 sec/batch
Epoch 2/20  Iteration 224/3560 Training loss: 2.5330 0.3897 sec/batch
Epoch 2/20  Iteration 225/3560 Training loss: 2.5321 0.3836 sec/batch
Epoch 2/20  Iteration 226/3560 Training loss: 2.5308 0.3873 sec/batch
Epoch 2/20  Iteration 227/3560 Training loss: 2.5297 0.3807 sec/batch
Epoch 2/20  Iteration 228/3560 Training loss: 2.5292 0.3811 sec/batch
Epoch 2/20  Iteration 229/3560 Training loss: 2.5280 0.3869 sec/batch
Epoch 2/20  Iteration 230/3560 Training loss: 2.5270 0.3909 sec/batch
Epoch 2/20  Iteration 231/3560 Training loss: 2.5258 0.3922 sec/batch
Epoch 2/20  Iteration 232/3560 Training loss: 2.5245 0.3856 sec/batch
Epoch 2/20  Iteration 233/3560 Training loss: 2.5233 0.4101 sec/batch
Epoch 2/20  Iteration 234/3560 Training loss: 2.5223 0.3849 sec/batch
Epoch 2/20  Iteration 235/3560 Training loss: 2.5213 0.3883 sec/batch
Epoch 2/20  Iteration 236/3560 Training loss: 2.5202 0.3855 sec/batch
Epoch 2/20  Iteration 237/3560 Training loss: 2.5189 0.3860 sec/batch
Epoch 2/20  Iteration 238/3560 Training loss: 2.5182 0.4031 sec/batch
Epoch 2/20  Iteration 239/3560 Training loss: 2.5170 0.3838 sec/batch
Epoch 2/20  Iteration 240/3560 Training loss: 2.5162 0.3845 sec/batch
Epoch 2/20  Iteration 241/3560 Training loss: 2.5155 0.3788 sec/batch
Epoch 2/20  Iteration 242/3560 Training loss: 2.5145 0.3820 sec/batch
Epoch 2/20  Iteration 243/3560 Training loss: 2.5133 0.3804 sec/batch
Epoch 2/20  Iteration 244/3560 Training loss: 2.5127 0.3794 sec/batch
Epoch 2/20  Iteration 245/3560 Training loss: 2.5117 0.3918 sec/batch
Epoch 2/20  Iteration 246/3560 Training loss: 2.5103 0.4021 sec/batch
Epoch 2/20  Iteration 247/3560 Training loss: 2.5092 0.3854 sec/batch
Epoch 2/20  Iteration 248/3560 Training loss: 2.5085 0.3796 sec/batch
Epoch 2/20  Iteration 249/3560 Training loss: 2.5076 0.3768 sec/batch
Epoch 2/20  Iteration 250/3560 Training loss: 2.5070 0.3856 sec/batch
Epoch 2/20  Iteration 251/3560 Training loss: 2.5060 0.3823 sec/batch
Epoch 2/20  Iteration 252/3560 Training loss: 2.5050 0.3797 sec/batch
Epoch 2/20  Iteration 253/3560 Training loss: 2.5042 0.3793 sec/batch
Epoch 2/20  Iteration 254/3560 Training loss: 2.5037 0.3912 sec/batch
Epoch 2/20  Iteration 255/3560 Training loss: 2.5027 0.3803 sec/batch
Epoch 2/20  Iteration 256/3560 Training loss: 2.5020 0.3913 sec/batch
Epoch 2/20  Iteration 257/3560 Training loss: 2.5009 0.3861 sec/batch
Epoch 2/20  Iteration 258/3560 Training loss: 2.4999 0.3735 sec/batch
Epoch 2/20  Iteration 259/3560 Training loss: 2.4989 0.4035 sec/batch
Epoch 2/20  Iteration 260/3560 Training loss: 2.4983 0.3869 sec/batch
Epoch 2/20  Iteration 261/3560 Training loss: 2.4974 0.3769 sec/batch
Epoch 2/20  Iteration 262/3560 Training loss: 2.4963 0.3792 sec/batch
Epoch 2/20  Iteration 263/3560 Training loss: 2.4950 0.3858 sec/batch
Epoch 2/20  Iteration 264/3560 Training loss: 2.4942 0.3819 sec/batch
Epoch 2/20  Iteration 265/3560 Training loss: 2.4934 0.3881 sec/batch
Epoch 2/20  Iteration 266/3560 Training loss: 2.4925 0.3807 sec/batch
Epoch 2/20  Iteration 267/3560 Training loss: 2.4916 0.3863 sec/batch
Epoch 2/20  Iteration 268/3560 Training loss: 2.4909 0.3854 sec/batch
Epoch 2/20  Iteration 269/3560 Training loss: 2.4900 0.3772 sec/batch
Epoch 2/20  Iteration 270/3560 Training loss: 2.4893 0.3847 sec/batch
Epoch 2/20  Iteration 271/3560 Training loss: 2.4884 0.3817 sec/batch
Epoch 2/20  Iteration 272/3560 Training loss: 2.4874 0.4047 sec/batch
Epoch 2/20  Iteration 273/3560 Training loss: 2.4864 0.3796 sec/batch
Epoch 2/20  Iteration 274/3560 Training loss: 2.4854 0.3810 sec/batch
Epoch 2/20  Iteration 275/3560 Training loss: 2.4845 0.3843 sec/batch
Epoch 2/20  Iteration 276/3560 Training loss: 2.4838 0.3875 sec/batch
Epoch 2/20  Iteration 277/3560 Training loss: 2.4829 0.4100 sec/batch
Epoch 2/20  Iteration 278/3560 Training loss: 2.4820 0.3802 sec/batch
Epoch 2/20  Iteration 279/3560 Training loss: 2.4813 0.3841 sec/batch
Epoch 2/20  Iteration 280/3560 Training loss: 2.4804 0.3868 sec/batch
Epoch 2/20  Iteration 281/3560 Training loss: 2.4794 0.3815 sec/batch
Epoch 2/20  Iteration 282/3560 Training loss: 2.4786 0.3797 sec/batch
Epoch 2/20  Iteration 283/3560 Training loss: 2.4777 0.3872 sec/batch
Epoch 2/20  Iteration 284/3560 Training loss: 2.4769 0.3801 sec/batch
Epoch 2/20  Iteration 285/3560 Training loss: 2.4760 0.4016 sec/batch
Epoch 2/20  Iteration 286/3560 Training loss: 2.4754 0.3847 sec/batch
Epoch 2/20  Iteration 287/3560 Training loss: 2.4748 0.3797 sec/batch
Epoch 2/20  Iteration 288/3560 Training loss: 2.4738 0.3867 sec/batch
Epoch 2/20  Iteration 289/3560 Training loss: 2.4731 0.4050 sec/batch
Epoch 2/20  Iteration 290/3560 Training loss: 2.4725 0.3919 sec/batch
Epoch 2/20  Iteration 291/3560 Training loss: 2.4717 0.3901 sec/batch
Epoch 2/20  Iteration 292/3560 Training loss: 2.4708 0.3857 sec/batch
Epoch 2/20  Iteration 293/3560 Training loss: 2.4700 0.3839 sec/batch
Epoch 2/20  Iteration 294/3560 Training loss: 2.4690 0.3887 sec/batch
Epoch 2/20  Iteration 295/3560 Training loss: 2.4682 0.3960 sec/batch
Epoch 2/20  Iteration 296/3560 Training loss: 2.4676 0.3812 sec/batch
Epoch 2/20  Iteration 297/3560 Training loss: 2.4671 0.3763 sec/batch
Epoch 2/20  Iteration 298/3560 Training loss: 2.4664 0.4059 sec/batch
Epoch 2/20  Iteration 299/3560 Training loss: 2.4659 0.3844 sec/batch
Epoch 2/20  Iteration 300/3560 Training loss: 2.4653 0.3828 sec/batch
Epoch 2/20  Iteration 301/3560 Training loss: 2.4646 0.3909 sec/batch
Epoch 2/20  Iteration 302/3560 Training loss: 2.4640 0.3753 sec/batch
Epoch 2/20  Iteration 303/3560 Training loss: 2.4633 0.3804 sec/batch
Epoch 2/20  Iteration 304/3560 Training loss: 2.4624 0.3841 sec/batch
Epoch 2/20  Iteration 305/3560 Training loss: 2.4619 0.3828 sec/batch
Epoch 2/20  Iteration 306/3560 Training loss: 2.4613 0.3890 sec/batch
Epoch 2/20  Iteration 307/3560 Training loss: 2.4607 0.3786 sec/batch
Epoch 2/20  Iteration 308/3560 Training loss: 2.4601 0.3798 sec/batch
Epoch 2/20  Iteration 309/3560 Training loss: 2.4594 0.3859 sec/batch
Epoch 2/20  Iteration 310/3560 Training loss: 2.4586 0.3792 sec/batch
Epoch 2/20  Iteration 311/3560 Training loss: 2.4580 0.3851 sec/batch
Epoch 2/20  Iteration 312/3560 Training loss: 2.4575 0.3938 sec/batch
Epoch 2/20  Iteration 313/3560 Training loss: 2.4567 0.3852 sec/batch
Epoch 2/20  Iteration 314/3560 Training loss: 2.4561 0.3827 sec/batch
Epoch 2/20  Iteration 315/3560 Training loss: 2.4555 0.3802 sec/batch
Epoch 2/20  Iteration 316/3560 Training loss: 2.4549 0.3800 sec/batch
Epoch 2/20  Iteration 317/3560 Training loss: 2.4545 0.3887 sec/batch
Epoch 2/20  Iteration 318/3560 Training loss: 2.4538 0.3889 sec/batch
Epoch 2/20  Iteration 319/3560 Training loss: 2.4533 0.3802 sec/batch
Epoch 2/20  Iteration 320/3560 Training loss: 2.4527 0.3795 sec/batch
Epoch 2/20  Iteration 321/3560 Training loss: 2.4521 0.3809 sec/batch
Epoch 2/20  Iteration 322/3560 Training loss: 2.4515 0.3916 sec/batch
Epoch 2/20  Iteration 323/3560 Training loss: 2.4509 0.3797 sec/batch
Epoch 2/20  Iteration 324/3560 Training loss: 2.4505 0.3911 sec/batch
Epoch 2/20  Iteration 325/3560 Training loss: 2.4500 0.3988 sec/batch
Epoch 2/20  Iteration 326/3560 Training loss: 2.4496 0.3880 sec/batch
Epoch 2/20  Iteration 327/3560 Training loss: 2.4489 0.3811 sec/batch
Epoch 2/20  Iteration 328/3560 Training loss: 2.4483 0.3805 sec/batch
Epoch 2/20  Iteration 329/3560 Training loss: 2.4479 0.3923 sec/batch
Epoch 2/20  Iteration 330/3560 Training loss: 2.4477 0.3981 sec/batch
Epoch 2/20  Iteration 331/3560 Training loss: 2.4472 0.7645 sec/batch
Epoch 2/20  Iteration 332/3560 Training loss: 2.4467 0.3833 sec/batch
Epoch 2/20  Iteration 333/3560 Training loss: 2.4461 0.3846 sec/batch
Epoch 2/20  Iteration 334/3560 Training loss: 2.4455 0.3871 sec/batch
Epoch 2/20  Iteration 335/3560 Training loss: 2.4449 0.3822 sec/batch
Epoch 2/20  Iteration 336/3560 Training loss: 2.4444 0.3805 sec/batch
Epoch 2/20  Iteration 337/3560 Training loss: 2.4437 0.4021 sec/batch
Epoch 2/20  Iteration 338/3560 Training loss: 2.4433 0.3814 sec/batch
Epoch 2/20  Iteration 339/3560 Training loss: 2.4429 0.3803 sec/batch
Epoch 2/20  Iteration 340/3560 Training loss: 2.4421 0.3807 sec/batch
Epoch 2/20  Iteration 341/3560 Training loss: 2.4415 0.3839 sec/batch
Epoch 2/20  Iteration 342/3560 Training loss: 2.4409 0.3813 sec/batch
Epoch 2/20  Iteration 343/3560 Training loss: 2.4405 0.3786 sec/batch
Epoch 2/20  Iteration 344/3560 Training loss: 2.4400 0.3812 sec/batch
Epoch 2/20  Iteration 345/3560 Training loss: 2.4395 0.3836 sec/batch
Epoch 2/20  Iteration 346/3560 Training loss: 2.4390 0.3841 sec/batch
Epoch 2/20  Iteration 347/3560 Training loss: 2.4385 0.3866 sec/batch
Epoch 2/20  Iteration 348/3560 Training loss: 2.4379 0.3761 sec/batch
Epoch 2/20  Iteration 349/3560 Training loss: 2.4374 0.3856 sec/batch
Epoch 2/20  Iteration 350/3560 Training loss: 2.4370 0.4013 sec/batch
Epoch 2/20  Iteration 351/3560 Training loss: 2.4367 0.3846 sec/batch
Epoch 2/20  Iteration 352/3560 Training loss: 2.4364 0.3881 sec/batch
Epoch 2/20  Iteration 353/3560 Training loss: 2.4361 0.3886 sec/batch
Epoch 2/20  Iteration 354/3560 Training loss: 2.4360 0.3883 sec/batch
Epoch 2/20  Iteration 355/3560 Training loss: 2.4357 0.3801 sec/batch
Epoch 2/20  Iteration 356/3560 Training loss: 2.4354 0.3814 sec/batch
Epoch 3/20  Iteration 357/3560 Training loss: 2.3916 0.3860 sec/batch
Epoch 3/20  Iteration 358/3560 Training loss: 2.3496 0.3856 sec/batch
Epoch 3/20  Iteration 359/3560 Training loss: 2.3375 0.3843 sec/batch
Epoch 3/20  Iteration 360/3560 Training loss: 2.3344 0.3860 sec/batch
Epoch 3/20  Iteration 361/3560 Training loss: 2.3330 0.3878 sec/batch
Epoch 3/20  Iteration 362/3560 Training loss: 2.3320 0.3847 sec/batch
Epoch 3/20  Iteration 363/3560 Training loss: 2.3322 0.4004 sec/batch
Epoch 3/20  Iteration 364/3560 Training loss: 2.3341 0.3912 sec/batch
Epoch 3/20  Iteration 365/3560 Training loss: 2.3355 0.3702 sec/batch
Epoch 3/20  Iteration 366/3560 Training loss: 2.3348 0.3879 sec/batch
Epoch 3/20  Iteration 367/3560 Training loss: 2.3327 0.3842 sec/batch
Epoch 3/20  Iteration 368/3560 Training loss: 2.3318 0.3798 sec/batch
Epoch 3/20  Iteration 369/3560 Training loss: 2.3318 0.3787 sec/batch
Epoch 3/20  Iteration 370/3560 Training loss: 2.3342 0.3783 sec/batch
Epoch 3/20  Iteration 371/3560 Training loss: 2.3338 0.3930 sec/batch
Epoch 3/20  Iteration 372/3560 Training loss: 2.3333 0.3828 sec/batch
Epoch 3/20  Iteration 373/3560 Training loss: 2.3335 0.3824 sec/batch
Epoch 3/20  Iteration 374/3560 Training loss: 2.3349 0.3797 sec/batch
Epoch 3/20  Iteration 375/3560 Training loss: 2.3350 0.3803 sec/batch
Epoch 3/20  Iteration 376/3560 Training loss: 2.3340 0.3991 sec/batch
Epoch 3/20  Iteration 377/3560 Training loss: 2.3335 0.3859 sec/batch
Epoch 3/20  Iteration 378/3560 Training loss: 2.3345 0.3838 sec/batch
Epoch 3/20  Iteration 379/3560 Training loss: 2.3341 0.3849 sec/batch
Epoch 3/20  Iteration 380/3560 Training loss: 2.3331 0.3942 sec/batch
Epoch 3/20  Iteration 381/3560 Training loss: 2.3322 0.3818 sec/batch
Epoch 3/20  Iteration 382/3560 Training loss: 2.3315 0.3884 sec/batch
Epoch 3/20  Iteration 383/3560 Training loss: 2.3305 0.3942 sec/batch
Epoch 3/20  Iteration 384/3560 Training loss: 2.3301 0.3837 sec/batch
Epoch 3/20  Iteration 385/3560 Training loss: 2.3303 0.3833 sec/batch
Epoch 3/20  Iteration 386/3560 Training loss: 2.3303 0.3807 sec/batch
Epoch 3/20  Iteration 387/3560 Training loss: 2.3303 0.3799 sec/batch
Epoch 3/20  Iteration 388/3560 Training loss: 2.3294 0.3908 sec/batch
Epoch 3/20  Iteration 389/3560 Training loss: 2.3286 0.3993 sec/batch
Epoch 3/20  Iteration 390/3560 Training loss: 2.3286 0.3827 sec/batch
Epoch 3/20  Iteration 391/3560 Training loss: 2.3279 0.3852 sec/batch
Epoch 3/20  Iteration 392/3560 Training loss: 2.3274 0.3844 sec/batch
Epoch 3/20  Iteration 393/3560 Training loss: 2.3269 0.3810 sec/batch
Epoch 3/20  Iteration 394/3560 Training loss: 2.3257 0.3881 sec/batch
Epoch 3/20  Iteration 395/3560 Training loss: 2.3249 0.3867 sec/batch
Epoch 3/20  Iteration 396/3560 Training loss: 2.3239 0.3811 sec/batch
Epoch 3/20  Iteration 397/3560 Training loss: 2.3230 0.3850 sec/batch
Epoch 3/20  Iteration 398/3560 Training loss: 2.3224 0.3896 sec/batch
Epoch 3/20  Iteration 399/3560 Training loss: 2.3215 0.3758 sec/batch
Epoch 3/20  Iteration 400/3560 Training loss: 2.3208 0.3810 sec/batch
Epoch 3/20  Iteration 401/3560 Training loss: 2.3201 0.3815 sec/batch
Epoch 3/20  Iteration 402/3560 Training loss: 2.3188 0.3998 sec/batch
Epoch 3/20  Iteration 403/3560 Training loss: 2.3187 0.3846 sec/batch
Epoch 3/20  Iteration 404/3560 Training loss: 2.3181 0.3857 sec/batch
Epoch 3/20  Iteration 405/3560 Training loss: 2.3176 0.3837 sec/batch
Epoch 3/20  Iteration 406/3560 Training loss: 2.3177 0.3941 sec/batch
Epoch 3/20  Iteration 407/3560 Training loss: 2.3168 0.6368 sec/batch
Epoch 3/20  Iteration 408/3560 Training loss: 2.3166 0.3954 sec/batch
Epoch 3/20  Iteration 409/3560 Training loss: 2.3160 0.3865 sec/batch
Epoch 3/20  Iteration 410/3560 Training loss: 2.3154 0.3874 sec/batch
Epoch 3/20  Iteration 411/3560 Training loss: 2.3146 0.3823 sec/batch
Epoch 3/20  Iteration 412/3560 Training loss: 2.3143 0.3813 sec/batch
Epoch 3/20  Iteration 413/3560 Training loss: 2.3139 0.3869 sec/batch
Epoch 3/20  Iteration 414/3560 Training loss: 2.3131 0.3997 sec/batch
Epoch 3/20  Iteration 415/3560 Training loss: 2.3125 0.3808 sec/batch
Epoch 3/20  Iteration 416/3560 Training loss: 2.3125 0.3830 sec/batch
Epoch 3/20  Iteration 417/3560 Training loss: 2.3123 0.3898 sec/batch
Epoch 3/20  Iteration 418/3560 Training loss: 2.3122 0.3852 sec/batch
Epoch 3/20  Iteration 419/3560 Training loss: 2.3121 0.3849 sec/batch
Epoch 3/20  Iteration 420/3560 Training loss: 2.3116 0.3792 sec/batch
Epoch 3/20  Iteration 421/3560 Training loss: 2.3109 0.3803 sec/batch
Epoch 3/20  Iteration 422/3560 Training loss: 2.3108 0.3796 sec/batch
Epoch 3/20  Iteration 423/3560 Training loss: 2.3105 0.3802 sec/batch
Epoch 3/20  Iteration 424/3560 Training loss: 2.3096 0.3929 sec/batch
Epoch 3/20  Iteration 425/3560 Training loss: 2.3091 0.3862 sec/batch
Epoch 3/20  Iteration 426/3560 Training loss: 2.3087 0.3842 sec/batch
Epoch 3/20  Iteration 427/3560 Training loss: 2.3085 0.3984 sec/batch
Epoch 3/20  Iteration 428/3560 Training loss: 2.3083 0.3869 sec/batch
Epoch 3/20  Iteration 429/3560 Training loss: 2.3081 0.3898 sec/batch
Epoch 3/20  Iteration 430/3560 Training loss: 2.3076 0.3812 sec/batch
Epoch 3/20  Iteration 431/3560 Training loss: 2.3071 0.3804 sec/batch
Epoch 3/20  Iteration 432/3560 Training loss: 2.3072 0.3810 sec/batch
Epoch 3/20  Iteration 433/3560 Training loss: 2.3068 0.3841 sec/batch
Epoch 3/20  Iteration 434/3560 Training loss: 2.3068 0.3804 sec/batch
Epoch 3/20  Iteration 435/3560 Training loss: 2.3062 0.3876 sec/batch
Epoch 3/20  Iteration 436/3560 Training loss: 2.3057 0.3853 sec/batch
Epoch 3/20  Iteration 437/3560 Training loss: 2.3052 0.3797 sec/batch
Epoch 3/20  Iteration 438/3560 Training loss: 2.3050 0.3943 sec/batch
Epoch 3/20  Iteration 439/3560 Training loss: 2.3046 0.3846 sec/batch
Epoch 3/20  Iteration 440/3560 Training loss: 2.3041 0.3947 sec/batch
Epoch 3/20  Iteration 441/3560 Training loss: 2.3034 0.4013 sec/batch
Epoch 3/20  Iteration 442/3560 Training loss: 2.3028 0.3887 sec/batch
Epoch 3/20  Iteration 443/3560 Training loss: 2.3024 0.3844 sec/batch
Epoch 3/20  Iteration 444/3560 Training loss: 2.3020 0.3857 sec/batch
Epoch 3/20  Iteration 445/3560 Training loss: 2.3014 0.3853 sec/batch
Epoch 3/20  Iteration 446/3560 Training loss: 2.3012 0.3997 sec/batch
Epoch 3/20  Iteration 447/3560 Training loss: 2.3008 0.3870 sec/batch
Epoch 3/20  Iteration 448/3560 Training loss: 2.3005 0.3875 sec/batch
Epoch 3/20  Iteration 449/3560 Training loss: 2.2999 0.3821 sec/batch
Epoch 3/20  Iteration 450/3560 Training loss: 2.2994 0.3802 sec/batch
Epoch 3/20  Iteration 451/3560 Training loss: 2.2988 0.5031 sec/batch
Epoch 3/20  Iteration 452/3560 Training loss: 2.2984 0.5056 sec/batch
Epoch 3/20  Iteration 453/3560 Training loss: 2.2980 0.4710 sec/batch
Epoch 3/20  Iteration 454/3560 Training loss: 2.2975 0.3808 sec/batch
Epoch 3/20  Iteration 455/3560 Training loss: 2.2971 0.3881 sec/batch
Epoch 3/20  Iteration 456/3560 Training loss: 2.2965 0.3797 sec/batch
Epoch 3/20  Iteration 457/3560 Training loss: 2.2962 0.3825 sec/batch
Epoch 3/20  Iteration 458/3560 Training loss: 2.2959 0.3862 sec/batch
Epoch 3/20  Iteration 459/3560 Training loss: 2.2953 0.3867 sec/batch
Epoch 3/20  Iteration 460/3560 Training loss: 2.2948 0.3832 sec/batch
Epoch 3/20  Iteration 461/3560 Training loss: 2.2943 0.3914 sec/batch
Epoch 3/20  Iteration 462/3560 Training loss: 2.2939 0.3871 sec/batch
Epoch 3/20  Iteration 463/3560 Training loss: 2.2935 0.3864 sec/batch
Epoch 3/20  Iteration 464/3560 Training loss: 2.2932 0.3896 sec/batch
Epoch 3/20  Iteration 465/3560 Training loss: 2.2930 0.3830 sec/batch
Epoch 3/20  Iteration 466/3560 Training loss: 2.2925 0.4104 sec/batch
Epoch 3/20  Iteration 467/3560 Training loss: 2.2922 0.3786 sec/batch
Epoch 3/20  Iteration 468/3560 Training loss: 2.2919 0.3846 sec/batch
Epoch 3/20  Iteration 469/3560 Training loss: 2.2915 0.3814 sec/batch
Epoch 3/20  Iteration 470/3560 Training loss: 2.2911 0.3859 sec/batch
Epoch 3/20  Iteration 471/3560 Training loss: 2.2907 0.3889 sec/batch
Epoch 3/20  Iteration 472/3560 Training loss: 2.2900 0.3876 sec/batch
Epoch 3/20  Iteration 473/3560 Training loss: 2.2896 0.3831 sec/batch
Epoch 3/20  Iteration 474/3560 Training loss: 2.2893 0.3902 sec/batch
Epoch 3/20  Iteration 475/3560 Training loss: 2.2891 0.3806 sec/batch
Epoch 3/20  Iteration 476/3560 Training loss: 2.2888 0.3862 sec/batch
Epoch 3/20  Iteration 477/3560 Training loss: 2.2886 0.3767 sec/batch
Epoch 3/20  Iteration 478/3560 Training loss: 2.2882 0.3837 sec/batch
Epoch 3/20  Iteration 479/3560 Training loss: 2.2878 0.4005 sec/batch
Epoch 3/20  Iteration 480/3560 Training loss: 2.2875 0.3820 sec/batch
Epoch 3/20  Iteration 481/3560 Training loss: 2.2871 0.3901 sec/batch
Epoch 3/20  Iteration 482/3560 Training loss: 2.2866 0.3932 sec/batch
Epoch 3/20  Iteration 483/3560 Training loss: 2.2863 0.3938 sec/batch
Epoch 3/20  Iteration 484/3560 Training loss: 2.2860 0.3868 sec/batch
Epoch 3/20  Iteration 485/3560 Training loss: 2.2857 0.3828 sec/batch
Epoch 3/20  Iteration 486/3560 Training loss: 2.2855 0.3958 sec/batch
Epoch 3/20  Iteration 487/3560 Training loss: 2.2852 0.3863 sec/batch
Epoch 3/20  Iteration 488/3560 Training loss: 2.2847 0.3874 sec/batch
Epoch 3/20  Iteration 489/3560 Training loss: 2.2845 0.3913 sec/batch
Epoch 3/20  Iteration 490/3560 Training loss: 2.2843 0.3799 sec/batch
Epoch 3/20  Iteration 491/3560 Training loss: 2.2839 0.3862 sec/batch
Epoch 3/20  Iteration 492/3560 Training loss: 2.2837 0.4051 sec/batch
Epoch 3/20  Iteration 493/3560 Training loss: 2.2834 0.3887 sec/batch
Epoch 3/20  Iteration 494/3560 Training loss: 2.2831 0.3784 sec/batch
Epoch 3/20  Iteration 495/3560 Training loss: 2.2830 0.3825 sec/batch
Epoch 3/20  Iteration 496/3560 Training loss: 2.2826 0.3811 sec/batch
Epoch 3/20  Iteration 497/3560 Training loss: 2.2825 0.3890 sec/batch
Epoch 3/20  Iteration 498/3560 Training loss: 2.2822 0.3866 sec/batch
Epoch 3/20  Iteration 499/3560 Training loss: 2.2819 0.3842 sec/batch
Epoch 3/20  Iteration 500/3560 Training loss: 2.2816 0.3810 sec/batch
Epoch 3/20  Iteration 501/3560 Training loss: 2.2813 0.3791 sec/batch
Epoch 3/20  Iteration 502/3560 Training loss: 2.2812 0.3839 sec/batch
Epoch 3/20  Iteration 503/3560 Training loss: 2.2809 0.3854 sec/batch
Epoch 3/20  Iteration 504/3560 Training loss: 2.2807 0.3873 sec/batch
Epoch 3/20  Iteration 505/3560 Training loss: 2.2805 0.3973 sec/batch
Epoch 3/20  Iteration 506/3560 Training loss: 2.2801 0.3825 sec/batch
Epoch 3/20  Iteration 507/3560 Training loss: 2.2799 0.3832 sec/batch
Epoch 3/20  Iteration 508/3560 Training loss: 2.2799 0.3850 sec/batch
Epoch 3/20  Iteration 509/3560 Training loss: 2.2797 0.3842 sec/batch
Epoch 3/20  Iteration 510/3560 Training loss: 2.2796 0.3792 sec/batch
Epoch 3/20  Iteration 511/3560 Training loss: 2.2793 0.3880 sec/batch
Epoch 3/20  Iteration 512/3560 Training loss: 2.2789 0.3878 sec/batch
Epoch 3/20  Iteration 513/3560 Training loss: 2.2786 0.3823 sec/batch
Epoch 3/20  Iteration 514/3560 Training loss: 2.2783 0.3807 sec/batch
Epoch 3/20  Iteration 515/3560 Training loss: 2.2778 0.3815 sec/batch
Epoch 3/20  Iteration 516/3560 Training loss: 2.2776 0.3826 sec/batch
Epoch 3/20  Iteration 517/3560 Training loss: 2.2774 0.3880 sec/batch
Epoch 3/20  Iteration 518/3560 Training loss: 2.2771 0.4096 sec/batch
Epoch 3/20  Iteration 519/3560 Training loss: 2.2767 0.3815 sec/batch
Epoch 3/20  Iteration 520/3560 Training loss: 2.2764 0.3801 sec/batch
Epoch 3/20  Iteration 521/3560 Training loss: 2.2762 0.3877 sec/batch
Epoch 3/20  Iteration 522/3560 Training loss: 2.2758 0.3840 sec/batch
Epoch 3/20  Iteration 523/3560 Training loss: 2.2756 0.3815 sec/batch
Epoch 3/20  Iteration 524/3560 Training loss: 2.2755 0.3821 sec/batch
Epoch 3/20  Iteration 525/3560 Training loss: 2.2752 0.3805 sec/batch
Epoch 3/20  Iteration 526/3560 Training loss: 2.2749 0.3816 sec/batch
Epoch 3/20  Iteration 527/3560 Training loss: 2.2745 0.3824 sec/batch
Epoch 3/20  Iteration 528/3560 Training loss: 2.2742 0.3878 sec/batch
Epoch 3/20  Iteration 529/3560 Training loss: 2.2739 0.3827 sec/batch
Epoch 3/20  Iteration 530/3560 Training loss: 2.2738 0.3840 sec/batch
Epoch 3/20  Iteration 531/3560 Training loss: 2.2736 0.3977 sec/batch
Epoch 3/20  Iteration 532/3560 Training loss: 2.2734 0.3801 sec/batch
Epoch 3/20  Iteration 533/3560 Training loss: 2.2731 0.3824 sec/batch
Epoch 3/20  Iteration 534/3560 Training loss: 2.2727 0.3836 sec/batch
Epoch 4/20  Iteration 535/3560 Training loss: 2.2767 0.3823 sec/batch
Epoch 4/20  Iteration 536/3560 Training loss: 2.2308 0.3949 sec/batch
Epoch 4/20  Iteration 537/3560 Training loss: 2.2198 0.3839 sec/batch
Epoch 4/20  Iteration 538/3560 Training loss: 2.2161 0.3891 sec/batch
Epoch 4/20  Iteration 539/3560 Training loss: 2.2163 0.3848 sec/batch
Epoch 4/20  Iteration 540/3560 Training loss: 2.2131 0.3807 sec/batch
Epoch 4/20  Iteration 541/3560 Training loss: 2.2142 0.3845 sec/batch
Epoch 4/20  Iteration 542/3560 Training loss: 2.2148 0.3943 sec/batch
Epoch 4/20  Iteration 543/3560 Training loss: 2.2161 0.3874 sec/batch
Epoch 4/20  Iteration 544/3560 Training loss: 2.2152 0.4084 sec/batch
Epoch 4/20  Iteration 545/3560 Training loss: 2.2146 0.3897 sec/batch
Epoch 4/20  Iteration 546/3560 Training loss: 2.2128 0.3814 sec/batch
Epoch 4/20  Iteration 547/3560 Training loss: 2.2126 0.3802 sec/batch
Epoch 4/20  Iteration 548/3560 Training loss: 2.2155 0.3845 sec/batch
Epoch 4/20  Iteration 549/3560 Training loss: 2.2154 0.3719 sec/batch
Epoch 4/20  Iteration 550/3560 Training loss: 2.2148 0.3853 sec/batch
Epoch 4/20  Iteration 551/3560 Training loss: 2.2148 0.3886 sec/batch
Epoch 4/20  Iteration 552/3560 Training loss: 2.2165 0.3805 sec/batch
Epoch 4/20  Iteration 553/3560 Training loss: 2.2166 0.3805 sec/batch
Epoch 4/20  Iteration 554/3560 Training loss: 2.2162 0.3826 sec/batch
Epoch 4/20  Iteration 555/3560 Training loss: 2.2154 0.3823 sec/batch
Epoch 4/20  Iteration 556/3560 Training loss: 2.2161 0.3815 sec/batch
Epoch 4/20  Iteration 557/3560 Training loss: 2.2158 0.4010 sec/batch
Epoch 4/20  Iteration 558/3560 Training loss: 2.2149 0.3868 sec/batch
Epoch 4/20  Iteration 559/3560 Training loss: 2.2141 0.4448 sec/batch
Epoch 4/20  Iteration 560/3560 Training loss: 2.2131 0.3881 sec/batch
Epoch 4/20  Iteration 561/3560 Training loss: 2.2122 0.3847 sec/batch
Epoch 4/20  Iteration 562/3560 Training loss: 2.2122 0.3799 sec/batch
Epoch 4/20  Iteration 563/3560 Training loss: 2.2125 0.3808 sec/batch
Epoch 4/20  Iteration 564/3560 Training loss: 2.2127 0.3861 sec/batch
Epoch 4/20  Iteration 565/3560 Training loss: 2.2130 0.3780 sec/batch
Epoch 4/20  Iteration 566/3560 Training loss: 2.2121 0.3846 sec/batch
Epoch 4/20  Iteration 567/3560 Training loss: 2.2116 0.3807 sec/batch
Epoch 4/20  Iteration 568/3560 Training loss: 2.2119 0.3854 sec/batch
Epoch 4/20  Iteration 569/3560 Training loss: 2.2115 0.3856 sec/batch
Epoch 4/20  Iteration 570/3560 Training loss: 2.2110 0.4187 sec/batch
Epoch 4/20  Iteration 571/3560 Training loss: 2.2105 0.3880 sec/batch
Epoch 4/20  Iteration 572/3560 Training loss: 2.2090 0.3882 sec/batch
Epoch 4/20  Iteration 573/3560 Training loss: 2.2080 0.3898 sec/batch
Epoch 4/20  Iteration 574/3560 Training loss: 2.2071 0.3823 sec/batch
Epoch 4/20  Iteration 575/3560 Training loss: 2.2063 0.3860 sec/batch
Epoch 4/20  Iteration 576/3560 Training loss: 2.2058 0.3811 sec/batch
Epoch 4/20  Iteration 577/3560 Training loss: 2.2051 0.3795 sec/batch
Epoch 4/20  Iteration 578/3560 Training loss: 2.2043 0.3897 sec/batch
Epoch 4/20  Iteration 579/3560 Training loss: 2.2036 0.3845 sec/batch
Epoch 4/20  Iteration 580/3560 Training loss: 2.2022 0.3840 sec/batch
Epoch 4/20  Iteration 581/3560 Training loss: 2.2022 0.3818 sec/batch
Epoch 4/20  Iteration 582/3560 Training loss: 2.2017 0.3831 sec/batch
Epoch 4/20  Iteration 583/3560 Training loss: 2.2013 0.4040 sec/batch
Epoch 4/20  Iteration 584/3560 Training loss: 2.2017 0.3794 sec/batch
Epoch 4/20  Iteration 585/3560 Training loss: 2.2010 0.3894 sec/batch
Epoch 4/20  Iteration 586/3560 Training loss: 2.2012 0.3952 sec/batch
Epoch 4/20  Iteration 587/3560 Training loss: 2.2006 0.3833 sec/batch
Epoch 4/20  Iteration 588/3560 Training loss: 2.2001 0.3882 sec/batch
Epoch 4/20  Iteration 589/3560 Training loss: 2.1994 0.3885 sec/batch
Epoch 4/20  Iteration 590/3560 Training loss: 2.1993 0.3843 sec/batch
Epoch 4/20  Iteration 591/3560 Training loss: 2.1990 0.3923 sec/batch
Epoch 4/20  Iteration 592/3560 Training loss: 2.1987 0.3889 sec/batch
Epoch 4/20  Iteration 593/3560 Training loss: 2.1982 0.4214 sec/batch
Epoch 4/20  Iteration 594/3560 Training loss: 2.1983 0.3843 sec/batch
Epoch 4/20  Iteration 595/3560 Training loss: 2.1979 0.3929 sec/batch
Epoch 4/20  Iteration 596/3560 Training loss: 2.1979 0.4157 sec/batch
Epoch 4/20  Iteration 597/3560 Training loss: 2.1979 0.3843 sec/batch
Epoch 4/20  Iteration 598/3560 Training loss: 2.1979 0.4048 sec/batch
Epoch 4/20  Iteration 599/3560 Training loss: 2.1974 0.3862 sec/batch
Epoch 4/20  Iteration 600/3560 Training loss: 2.1973 0.3942 sec/batch
Epoch 4/20  Iteration 601/3560 Training loss: 2.1972 0.3875 sec/batch
Epoch 4/20  Iteration 602/3560 Training loss: 2.1965 0.3826 sec/batch
Epoch 4/20  Iteration 603/3560 Training loss: 2.1960 0.3978 sec/batch
Epoch 4/20  Iteration 604/3560 Training loss: 2.1959 0.3812 sec/batch
Epoch 4/20  Iteration 605/3560 Training loss: 2.1959 0.3856 sec/batch
Epoch 4/20  Iteration 606/3560 Training loss: 2.1956 0.3788 sec/batch
Epoch 4/20  Iteration 607/3560 Training loss: 2.1955 0.3882 sec/batch
Epoch 4/20  Iteration 608/3560 Training loss: 2.1950 0.3877 sec/batch
Epoch 4/20  Iteration 609/3560 Training loss: 2.1947 0.4118 sec/batch
Epoch 4/20  Iteration 610/3560 Training loss: 2.1950 0.3777 sec/batch
Epoch 4/20  Iteration 611/3560 Training loss: 2.1947 0.3828 sec/batch
Epoch 4/20  Iteration 612/3560 Training loss: 2.1947 0.3928 sec/batch
Epoch 4/20  Iteration 613/3560 Training loss: 2.1941 0.3833 sec/batch
Epoch 4/20  Iteration 614/3560 Training loss: 2.1938 0.3800 sec/batch
Epoch 4/20  Iteration 615/3560 Training loss: 2.1932 0.3846 sec/batch
Epoch 4/20  Iteration 616/3560 Training loss: 2.1930 0.3834 sec/batch
Epoch 4/20  Iteration 617/3560 Training loss: 2.1925 0.3875 sec/batch
Epoch 4/20  Iteration 618/3560 Training loss: 2.1921 0.3790 sec/batch
Epoch 4/20  Iteration 619/3560 Training loss: 2.1912 0.3819 sec/batch
Epoch 4/20  Iteration 620/3560 Training loss: 2.1906 0.3950 sec/batch
Epoch 4/20  Iteration 621/3560 Training loss: 2.1904 0.3851 sec/batch
Epoch 4/20  Iteration 622/3560 Training loss: 2.1900 0.4099 sec/batch
Epoch 4/20  Iteration 623/3560 Training loss: 2.1895 0.3858 sec/batch
Epoch 4/20  Iteration 624/3560 Training loss: 2.1894 0.3790 sec/batch
Epoch 4/20  Iteration 625/3560 Training loss: 2.1889 0.3878 sec/batch
Epoch 4/20  Iteration 626/3560 Training loss: 2.1887 0.3941 sec/batch
Epoch 4/20  Iteration 627/3560 Training loss: 2.1881 0.3912 sec/batch
Epoch 4/20  Iteration 628/3560 Training loss: 2.1876 0.3831 sec/batch
Epoch 4/20  Iteration 629/3560 Training loss: 2.1871 0.3858 sec/batch
Epoch 4/20  Iteration 630/3560 Training loss: 2.1868 0.3872 sec/batch
Epoch 4/20  Iteration 631/3560 Training loss: 2.1866 0.3979 sec/batch
Epoch 4/20  Iteration 632/3560 Training loss: 2.1861 0.3882 sec/batch
Epoch 4/20  Iteration 633/3560 Training loss: 2.1857 0.3791 sec/batch
Epoch 4/20  Iteration 634/3560 Training loss: 2.1851 0.3926 sec/batch
Epoch 4/20  Iteration 635/3560 Training loss: 2.1849 0.4298 sec/batch
Epoch 4/20  Iteration 636/3560 Training loss: 2.1846 0.3815 sec/batch
Epoch 4/20  Iteration 637/3560 Training loss: 2.1841 0.3918 sec/batch
Epoch 4/20  Iteration 638/3560 Training loss: 2.1837 0.3914 sec/batch
Epoch 4/20  Iteration 639/3560 Training loss: 2.1832 0.3866 sec/batch
Epoch 4/20  Iteration 640/3560 Training loss: 2.1829 0.3811 sec/batch
Epoch 4/20  Iteration 641/3560 Training loss: 2.1826 0.3803 sec/batch
Epoch 4/20  Iteration 642/3560 Training loss: 2.1823 0.3825 sec/batch
Epoch 4/20  Iteration 643/3560 Training loss: 2.1822 0.3924 sec/batch
Epoch 4/20  Iteration 644/3560 Training loss: 2.1818 0.3862 sec/batch
Epoch 4/20  Iteration 645/3560 Training loss: 2.1816 0.3804 sec/batch
Epoch 4/20  Iteration 646/3560 Training loss: 2.1813 0.3849 sec/batch
Epoch 4/20  Iteration 647/3560 Training loss: 2.1810 0.3933 sec/batch
Epoch 4/20  Iteration 648/3560 Training loss: 2.1807 0.4177 sec/batch
Epoch 4/20  Iteration 649/3560 Training loss: 2.1803 0.3821 sec/batch
Epoch 4/20  Iteration 650/3560 Training loss: 2.1798 0.3887 sec/batch
Epoch 4/20  Iteration 651/3560 Training loss: 2.1795 0.3852 sec/batch
Epoch 4/20  Iteration 652/3560 Training loss: 2.1792 0.3903 sec/batch
Epoch 4/20  Iteration 653/3560 Training loss: 2.1792 0.3756 sec/batch
Epoch 4/20  Iteration 654/3560 Training loss: 2.1790 0.3874 sec/batch
Epoch 4/20  Iteration 655/3560 Training loss: 2.1789 0.3860 sec/batch
Epoch 4/20  Iteration 656/3560 Training loss: 2.1786 0.3835 sec/batch
Epoch 4/20  Iteration 657/3560 Training loss: 2.1783 0.3815 sec/batch
Epoch 4/20  Iteration 658/3560 Training loss: 2.1781 0.3787 sec/batch
Epoch 4/20  Iteration 659/3560 Training loss: 2.1779 0.3819 sec/batch
Epoch 4/20  Iteration 660/3560 Training loss: 2.1774 0.3832 sec/batch
Epoch 4/20  Iteration 661/3560 Training loss: 2.1772 0.4107 sec/batch
Epoch 4/20  Iteration 662/3560 Training loss: 2.1770 0.3959 sec/batch
Epoch 4/20  Iteration 663/3560 Training loss: 2.1769 0.3857 sec/batch
Epoch 4/20  Iteration 664/3560 Training loss: 2.1768 0.3871 sec/batch
Epoch 4/20  Iteration 665/3560 Training loss: 2.1765 0.3857 sec/batch
Epoch 4/20  Iteration 666/3560 Training loss: 2.1761 0.3822 sec/batch
Epoch 4/20  Iteration 667/3560 Training loss: 2.1759 0.3875 sec/batch
Epoch 4/20  Iteration 668/3560 Training loss: 2.1758 0.3874 sec/batch
Epoch 4/20  Iteration 669/3560 Training loss: 2.1755 0.3888 sec/batch
Epoch 4/20  Iteration 670/3560 Training loss: 2.1754 0.3851 sec/batch
Epoch 4/20  Iteration 671/3560 Training loss: 2.1752 0.3868 sec/batch
Epoch 4/20  Iteration 672/3560 Training loss: 2.1751 0.3860 sec/batch
Epoch 4/20  Iteration 673/3560 Training loss: 2.1750 0.3792 sec/batch
Epoch 4/20  Iteration 674/3560 Training loss: 2.1748 0.4081 sec/batch
Epoch 4/20  Iteration 675/3560 Training loss: 2.1747 0.3864 sec/batch
Epoch 4/20  Iteration 676/3560 Training loss: 2.1744 0.3852 sec/batch
Epoch 4/20  Iteration 677/3560 Training loss: 2.1742 0.3871 sec/batch
Epoch 4/20  Iteration 678/3560 Training loss: 2.1740 0.3920 sec/batch
Epoch 4/20  Iteration 679/3560 Training loss: 2.1736 0.3795 sec/batch
Epoch 4/20  Iteration 680/3560 Training loss: 2.1736 0.3876 sec/batch
Epoch 4/20  Iteration 681/3560 Training loss: 2.1734 0.3824 sec/batch
Epoch 4/20  Iteration 682/3560 Training loss: 2.1733 0.3902 sec/batch
Epoch 4/20  Iteration 683/3560 Training loss: 2.1731 0.3817 sec/batch
Epoch 4/20  Iteration 684/3560 Training loss: 2.1727 0.3895 sec/batch
Epoch 4/20  Iteration 685/3560 Training loss: 2.1725 0.3813 sec/batch
Epoch 4/20  Iteration 686/3560 Training loss: 2.1725 0.3869 sec/batch
Epoch 4/20  Iteration 687/3560 Training loss: 2.1723 0.4125 sec/batch
Epoch 4/20  Iteration 688/3560 Training loss: 2.1722 0.3966 sec/batch
Epoch 4/20  Iteration 689/3560 Training loss: 2.1719 0.3866 sec/batch
Epoch 4/20  Iteration 690/3560 Training loss: 2.1716 0.3835 sec/batch
Epoch 4/20  Iteration 691/3560 Training loss: 2.1714 0.3795 sec/batch
Epoch 4/20  Iteration 692/3560 Training loss: 2.1712 0.3986 sec/batch
Epoch 4/20  Iteration 693/3560 Training loss: 2.1708 0.3864 sec/batch
Epoch 4/20  Iteration 694/3560 Training loss: 2.1708 0.3849 sec/batch
Epoch 4/20  Iteration 695/3560 Training loss: 2.1706 0.3852 sec/batch
Epoch 4/20  Iteration 696/3560 Training loss: 2.1703 0.3860 sec/batch
Epoch 4/20  Iteration 697/3560 Training loss: 2.1700 0.3910 sec/batch
Epoch 4/20  Iteration 698/3560 Training loss: 2.1698 0.3828 sec/batch
Epoch 4/20  Iteration 699/3560 Training loss: 2.1697 0.3816 sec/batch
Epoch 4/20  Iteration 700/3560 Training loss: 2.1695 0.4103 sec/batch
Epoch 4/20  Iteration 701/3560 Training loss: 2.1693 0.3811 sec/batch
Epoch 4/20  Iteration 702/3560 Training loss: 2.1693 0.3834 sec/batch
Epoch 4/20  Iteration 703/3560 Training loss: 2.1691 0.3846 sec/batch
Epoch 4/20  Iteration 704/3560 Training loss: 2.1689 0.3838 sec/batch
Epoch 4/20  Iteration 705/3560 Training loss: 2.1686 0.3852 sec/batch
Epoch 4/20  Iteration 706/3560 Training loss: 2.1683 0.3959 sec/batch
Epoch 4/20  Iteration 707/3560 Training loss: 2.1682 0.3861 sec/batch
Epoch 4/20  Iteration 708/3560 Training loss: 2.1681 0.3827 sec/batch
Epoch 4/20  Iteration 709/3560 Training loss: 2.1680 0.3915 sec/batch
Epoch 4/20  Iteration 710/3560 Training loss: 2.1679 0.3804 sec/batch
Epoch 4/20  Iteration 711/3560 Training loss: 2.1676 0.4062 sec/batch
Epoch 4/20  Iteration 712/3560 Training loss: 2.1674 0.3850 sec/batch
Epoch 5/20  Iteration 713/3560 Training loss: 2.1837 0.4048 sec/batch
Epoch 5/20  Iteration 714/3560 Training loss: 2.1412 0.3806 sec/batch
Epoch 5/20  Iteration 715/3560 Training loss: 2.1337 0.3817 sec/batch
Epoch 5/20  Iteration 716/3560 Training loss: 2.1285 0.3809 sec/batch
Epoch 5/20  Iteration 717/3560 Training loss: 2.1267 0.3822 sec/batch
Epoch 5/20  Iteration 718/3560 Training loss: 2.1222 0.3915 sec/batch
Epoch 5/20  Iteration 719/3560 Training loss: 2.1230 0.3862 sec/batch
Epoch 5/20  Iteration 720/3560 Training loss: 2.1239 0.3830 sec/batch
Epoch 5/20  Iteration 721/3560 Training loss: 2.1275 0.3925 sec/batch
Epoch 5/20  Iteration 722/3560 Training loss: 2.1268 0.3779 sec/batch
Epoch 5/20  Iteration 723/3560 Training loss: 2.1243 0.3848 sec/batch
Epoch 5/20  Iteration 724/3560 Training loss: 2.1223 0.3849 sec/batch
Epoch 5/20  Iteration 725/3560 Training loss: 2.1225 0.3832 sec/batch
Epoch 5/20  Iteration 726/3560 Training loss: 2.1240 0.4214 sec/batch
Epoch 5/20  Iteration 727/3560 Training loss: 2.1232 0.3905 sec/batch
Epoch 5/20  Iteration 728/3560 Training loss: 2.1222 0.3800 sec/batch
Epoch 5/20  Iteration 729/3560 Training loss: 2.1223 0.3779 sec/batch
Epoch 5/20  Iteration 730/3560 Training loss: 2.1239 0.3838 sec/batch
Epoch 5/20  Iteration 731/3560 Training loss: 2.1242 0.3866 sec/batch
Epoch 5/20  Iteration 732/3560 Training loss: 2.1235 0.3828 sec/batch
Epoch 5/20  Iteration 733/3560 Training loss: 2.1228 0.3854 sec/batch
Epoch 5/20  Iteration 734/3560 Training loss: 2.1236 0.3830 sec/batch
Epoch 5/20  Iteration 735/3560 Training loss: 2.1229 0.3738 sec/batch
Epoch 5/20  Iteration 736/3560 Training loss: 2.1219 0.3785 sec/batch
Epoch 5/20  Iteration 737/3560 Training loss: 2.1210 0.3808 sec/batch
Epoch 5/20  Iteration 738/3560 Training loss: 2.1203 0.3861 sec/batch
Epoch 5/20  Iteration 739/3560 Training loss: 2.1194 0.4092 sec/batch
Epoch 5/20  Iteration 740/3560 Training loss: 2.1193 0.3872 sec/batch
Epoch 5/20  Iteration 741/3560 Training loss: 2.1197 0.3846 sec/batch
Epoch 5/20  Iteration 742/3560 Training loss: 2.1202 0.3855 sec/batch
Epoch 5/20  Iteration 743/3560 Training loss: 2.1204 0.3825 sec/batch
Epoch 5/20  Iteration 744/3560 Training loss: 2.1200 0.3792 sec/batch
Epoch 5/20  Iteration 745/3560 Training loss: 2.1199 0.3767 sec/batch
Epoch 5/20  Iteration 746/3560 Training loss: 2.1202 0.3790 sec/batch
Epoch 5/20  Iteration 747/3560 Training loss: 2.1193 0.3834 sec/batch
Epoch 5/20  Iteration 748/3560 Training loss: 2.1188 0.3978 sec/batch
Epoch 5/20  Iteration 749/3560 Training loss: 2.1183 0.3769 sec/batch
Epoch 5/20  Iteration 750/3560 Training loss: 2.1169 0.3874 sec/batch
Epoch 5/20  Iteration 751/3560 Training loss: 2.1161 0.3862 sec/batch
Epoch 5/20  Iteration 752/3560 Training loss: 2.1152 0.3973 sec/batch
Epoch 5/20  Iteration 753/3560 Training loss: 2.1146 0.3891 sec/batch
Epoch 5/20  Iteration 754/3560 Training loss: 2.1140 0.3930 sec/batch
Epoch 5/20  Iteration 755/3560 Training loss: 2.1133 0.3804 sec/batch
Epoch 5/20  Iteration 756/3560 Training loss: 2.1125 0.3817 sec/batch
Epoch 5/20  Iteration 757/3560 Training loss: 2.1121 0.3877 sec/batch
Epoch 5/20  Iteration 758/3560 Training loss: 2.1105 0.3777 sec/batch
Epoch 5/20  Iteration 759/3560 Training loss: 2.1105 0.3824 sec/batch
Epoch 5/20  Iteration 760/3560 Training loss: 2.1101 0.3787 sec/batch
Epoch 5/20  Iteration 761/3560 Training loss: 2.1098 0.3823 sec/batch
Epoch 5/20  Iteration 762/3560 Training loss: 2.1102 0.3822 sec/batch
Epoch 5/20  Iteration 763/3560 Training loss: 2.1094 0.3811 sec/batch
Epoch 5/20  Iteration 764/3560 Training loss: 2.1097 0.3812 sec/batch
Epoch 5/20  Iteration 765/3560 Training loss: 2.1093 0.3846 sec/batch
Epoch 5/20  Iteration 766/3560 Training loss: 2.1090 0.3977 sec/batch
Epoch 5/20  Iteration 767/3560 Training loss: 2.1084 0.3822 sec/batch
Epoch 5/20  Iteration 768/3560 Training loss: 2.1085 0.3858 sec/batch
Epoch 5/20  Iteration 769/3560 Training loss: 2.1084 0.3782 sec/batch
Epoch 5/20  Iteration 770/3560 Training loss: 2.1082 0.3860 sec/batch
Epoch 5/20  Iteration 771/3560 Training loss: 2.1077 0.3811 sec/batch
Epoch 5/20  Iteration 772/3560 Training loss: 2.1079 0.3951 sec/batch
Epoch 5/20  Iteration 773/3560 Training loss: 2.1075 0.3800 sec/batch
Epoch 5/20  Iteration 774/3560 Training loss: 2.1078 0.3831 sec/batch
Epoch 5/20  Iteration 775/3560 Training loss: 2.1081 0.3947 sec/batch
Epoch 5/20  Iteration 776/3560 Training loss: 2.1080 0.3839 sec/batch
Epoch 5/20  Iteration 777/3560 Training loss: 2.1077 0.3812 sec/batch
Epoch 5/20  Iteration 778/3560 Training loss: 2.1079 0.3862 sec/batch
Epoch 5/20  Iteration 779/3560 Training loss: 2.1078 0.4032 sec/batch
Epoch 5/20  Iteration 780/3560 Training loss: 2.1071 0.3905 sec/batch
Epoch 5/20  Iteration 781/3560 Training loss: 2.1067 0.3774 sec/batch
Epoch 5/20  Iteration 782/3560 Training loss: 2.1067 0.3817 sec/batch
Epoch 5/20  Iteration 783/3560 Training loss: 2.1069 0.3871 sec/batch
Epoch 5/20  Iteration 784/3560 Training loss: 2.1068 0.3832 sec/batch
Epoch 5/20  Iteration 785/3560 Training loss: 2.1069 0.3845 sec/batch
Epoch 5/20  Iteration 786/3560 Training loss: 2.1065 0.3851 sec/batch
Epoch 5/20  Iteration 787/3560 Training loss: 2.1063 0.3813 sec/batch
Epoch 5/20  Iteration 788/3560 Training loss: 2.1065 0.4019 sec/batch
Epoch 5/20  Iteration 789/3560 Training loss: 2.1063 0.3963 sec/batch
Epoch 5/20  Iteration 790/3560 Training loss: 2.1062 0.3828 sec/batch
Epoch 5/20  Iteration 791/3560 Training loss: 2.1057 0.3832 sec/batch
Epoch 5/20  Iteration 792/3560 Training loss: 2.1053 0.4008 sec/batch
Epoch 5/20  Iteration 793/3560 Training loss: 2.1047 0.3770 sec/batch
Epoch 5/20  Iteration 794/3560 Training loss: 2.1048 0.3886 sec/batch
Epoch 5/20  Iteration 795/3560 Training loss: 2.1042 0.3777 sec/batch
Epoch 5/20  Iteration 796/3560 Training loss: 2.1039 0.3778 sec/batch
Epoch 5/20  Iteration 797/3560 Training loss: 2.1031 0.3797 sec/batch
Epoch 5/20  Iteration 798/3560 Training loss: 2.1027 0.3806 sec/batch
Epoch 5/20  Iteration 799/3560 Training loss: 2.1024 0.3800 sec/batch
Epoch 5/20  Iteration 800/3560 Training loss: 2.1021 0.3890 sec/batch
Epoch 5/20  Iteration 801/3560 Training loss: 2.1015 0.3847 sec/batch
Epoch 5/20  Iteration 802/3560 Training loss: 2.1014 0.3850 sec/batch
Epoch 5/20  Iteration 803/3560 Training loss: 2.1012 0.3808 sec/batch
Epoch 5/20  Iteration 804/3560 Training loss: 2.1009 0.3930 sec/batch
Epoch 5/20  Iteration 805/3560 Training loss: 2.1004 0.4016 sec/batch
Epoch 5/20  Iteration 806/3560 Training loss: 2.1001 0.3936 sec/batch
Epoch 5/20  Iteration 807/3560 Training loss: 2.0997 0.3819 sec/batch
Epoch 5/20  Iteration 808/3560 Training loss: 2.0994 0.3817 sec/batch
Epoch 5/20  Iteration 809/3560 Training loss: 2.0991 0.3829 sec/batch
Epoch 5/20  Iteration 810/3560 Training loss: 2.0987 0.3799 sec/batch
Epoch 5/20  Iteration 811/3560 Training loss: 2.0983 0.3794 sec/batch
Epoch 5/20  Iteration 812/3560 Training loss: 2.0978 0.3849 sec/batch
Epoch 5/20  Iteration 813/3560 Training loss: 2.0977 0.3815 sec/batch
Epoch 5/20  Iteration 814/3560 Training loss: 2.0976 0.3820 sec/batch
Epoch 5/20  Iteration 815/3560 Training loss: 2.0971 0.3872 sec/batch
Epoch 5/20  Iteration 816/3560 Training loss: 2.0967 0.3811 sec/batch
Epoch 5/20  Iteration 817/3560 Training loss: 2.0964 0.3846 sec/batch
Epoch 5/20  Iteration 818/3560 Training loss: 2.0962 0.4162 sec/batch
Epoch 5/20  Iteration 819/3560 Training loss: 2.0960 0.3821 sec/batch
Epoch 5/20  Iteration 820/3560 Training loss: 2.0959 0.3839 sec/batch
Epoch 5/20  Iteration 821/3560 Training loss: 2.0958 0.3855 sec/batch
Epoch 5/20  Iteration 822/3560 Training loss: 2.0956 0.3882 sec/batch
Epoch 5/20  Iteration 823/3560 Training loss: 2.0955 0.3795 sec/batch
Epoch 5/20  Iteration 824/3560 Training loss: 2.0952 0.3788 sec/batch
Epoch 5/20  Iteration 825/3560 Training loss: 2.0950 0.3858 sec/batch
Epoch 5/20  Iteration 826/3560 Training loss: 2.0948 0.3879 sec/batch
Epoch 5/20  Iteration 827/3560 Training loss: 2.0945 0.3836 sec/batch
Epoch 5/20  Iteration 828/3560 Training loss: 2.0940 0.3852 sec/batch
Epoch 5/20  Iteration 829/3560 Training loss: 2.0938 0.3797 sec/batch
Epoch 5/20  Iteration 830/3560 Training loss: 2.0937 0.3900 sec/batch
Epoch 5/20  Iteration 831/3560 Training loss: 2.0936 0.4034 sec/batch
Epoch 5/20  Iteration 832/3560 Training loss: 2.0933 0.3823 sec/batch
Epoch 5/20  Iteration 833/3560 Training loss: 2.0933 0.3795 sec/batch
Epoch 5/20  Iteration 834/3560 Training loss: 2.0929 0.3899 sec/batch
Epoch 5/20  Iteration 835/3560 Training loss: 2.0926 0.3834 sec/batch
Epoch 5/20  Iteration 836/3560 Training loss: 2.0924 0.3873 sec/batch
Epoch 5/20  Iteration 837/3560 Training loss: 2.0923 0.3827 sec/batch
Epoch 5/20  Iteration 838/3560 Training loss: 2.0918 0.3803 sec/batch
Epoch 5/20  Iteration 839/3560 Training loss: 2.0917 0.3818 sec/batch
Epoch 5/20  Iteration 840/3560 Training loss: 2.0916 0.3825 sec/batch
Epoch 5/20  Iteration 841/3560 Training loss: 2.0915 0.3836 sec/batch
Epoch 5/20  Iteration 842/3560 Training loss: 2.0914 0.3832 sec/batch
Epoch 5/20  Iteration 843/3560 Training loss: 2.0911 0.3791 sec/batch
Epoch 5/20  Iteration 844/3560 Training loss: 2.0907 0.4145 sec/batch
Epoch 5/20  Iteration 845/3560 Training loss: 2.0906 0.3838 sec/batch
Epoch 5/20  Iteration 846/3560 Training loss: 2.0904 0.3826 sec/batch
Epoch 5/20  Iteration 847/3560 Training loss: 2.0902 0.3791 sec/batch
Epoch 5/20  Iteration 848/3560 Training loss: 2.0901 0.3860 sec/batch
Epoch 5/20  Iteration 849/3560 Training loss: 2.0899 0.3787 sec/batch
Epoch 5/20  Iteration 850/3560 Training loss: 2.0898 0.3819 sec/batch
Epoch 5/20  Iteration 851/3560 Training loss: 2.0899 0.3821 sec/batch
Epoch 5/20  Iteration 852/3560 Training loss: 2.0896 0.3851 sec/batch
Epoch 5/20  Iteration 853/3560 Training loss: 2.0896 0.3837 sec/batch
Epoch 5/20  Iteration 854/3560 Training loss: 2.0894 0.3826 sec/batch
Epoch 5/20  Iteration 855/3560 Training loss: 2.0892 0.3822 sec/batch
Epoch 5/20  Iteration 856/3560 Training loss: 2.0890 0.3825 sec/batch
Epoch 5/20  Iteration 857/3560 Training loss: 2.0887 0.4015 sec/batch
Epoch 5/20  Iteration 858/3560 Training loss: 2.0888 0.3849 sec/batch
Epoch 5/20  Iteration 859/3560 Training loss: 2.0886 0.3875 sec/batch
Epoch 5/20  Iteration 860/3560 Training loss: 2.0886 0.3795 sec/batch
Epoch 5/20  Iteration 861/3560 Training loss: 2.0884 0.3835 sec/batch
Epoch 5/20  Iteration 862/3560 Training loss: 2.0881 0.3981 sec/batch
Epoch 5/20  Iteration 863/3560 Training loss: 2.0879 0.3793 sec/batch
Epoch 5/20  Iteration 864/3560 Training loss: 2.0879 0.3864 sec/batch
Epoch 5/20  Iteration 865/3560 Training loss: 2.0877 0.4026 sec/batch
Epoch 5/20  Iteration 866/3560 Training loss: 2.0877 0.3835 sec/batch
Epoch 5/20  Iteration 867/3560 Training loss: 2.0875 0.3841 sec/batch
Epoch 5/20  Iteration 868/3560 Training loss: 2.0874 0.3803 sec/batch
Epoch 5/20  Iteration 869/3560 Training loss: 2.0872 0.3810 sec/batch
Epoch 5/20  Iteration 870/3560 Training loss: 2.0870 0.3928 sec/batch
Epoch 5/20  Iteration 871/3560 Training loss: 2.0867 0.3887 sec/batch
Epoch 5/20  Iteration 872/3560 Training loss: 2.0868 0.3839 sec/batch
Epoch 5/20  Iteration 873/3560 Training loss: 2.0868 0.3920 sec/batch
Epoch 5/20  Iteration 874/3560 Training loss: 2.0865 0.3781 sec/batch
Epoch 5/20  Iteration 875/3560 Training loss: 2.0864 0.3802 sec/batch
Epoch 5/20  Iteration 876/3560 Training loss: 2.0863 0.3835 sec/batch
Epoch 5/20  Iteration 877/3560 Training loss: 2.0861 0.3886 sec/batch
Epoch 5/20  Iteration 878/3560 Training loss: 2.0860 0.3876 sec/batch
Epoch 5/20  Iteration 879/3560 Training loss: 2.0859 0.3839 sec/batch
Epoch 5/20  Iteration 880/3560 Training loss: 2.0861 0.3785 sec/batch
Epoch 5/20  Iteration 881/3560 Training loss: 2.0859 0.3789 sec/batch
Epoch 5/20  Iteration 882/3560 Training loss: 2.0857 0.3791 sec/batch
Epoch 5/20  Iteration 883/3560 Training loss: 2.0855 0.3953 sec/batch
Epoch 5/20  Iteration 884/3560 Training loss: 2.0852 0.4004 sec/batch
Epoch 5/20  Iteration 885/3560 Training loss: 2.0852 0.3798 sec/batch
Epoch 5/20  Iteration 886/3560 Training loss: 2.0851 0.3812 sec/batch
Epoch 5/20  Iteration 887/3560 Training loss: 2.0850 0.4099 sec/batch
Epoch 5/20  Iteration 888/3560 Training loss: 2.0849 0.3816 sec/batch
Epoch 5/20  Iteration 889/3560 Training loss: 2.0848 0.3856 sec/batch
Epoch 5/20  Iteration 890/3560 Training loss: 2.0847 0.3824 sec/batch
Epoch 6/20  Iteration 891/3560 Training loss: 2.1140 0.3825 sec/batch
Epoch 6/20  Iteration 892/3560 Training loss: 2.0756 0.3848 sec/batch
Epoch 6/20  Iteration 893/3560 Training loss: 2.0621 0.3850 sec/batch
Epoch 6/20  Iteration 894/3560 Training loss: 2.0571 0.3847 sec/batch
Epoch 6/20  Iteration 895/3560 Training loss: 2.0558 0.3851 sec/batch
Epoch 6/20  Iteration 896/3560 Training loss: 2.0504 0.3884 sec/batch
Epoch 6/20  Iteration 897/3560 Training loss: 2.0507 0.3971 sec/batch
Epoch 6/20  Iteration 898/3560 Training loss: 2.0525 0.3874 sec/batch
Epoch 6/20  Iteration 899/3560 Training loss: 2.0550 0.3793 sec/batch
Epoch 6/20  Iteration 900/3560 Training loss: 2.0545 0.4309 sec/batch
Epoch 6/20  Iteration 901/3560 Training loss: 2.0520 0.3818 sec/batch
Epoch 6/20  Iteration 902/3560 Training loss: 2.0498 0.3937 sec/batch
Epoch 6/20  Iteration 903/3560 Training loss: 2.0503 0.3897 sec/batch
Epoch 6/20  Iteration 904/3560 Training loss: 2.0523 0.3798 sec/batch
Epoch 6/20  Iteration 905/3560 Training loss: 2.0512 0.4007 sec/batch
Epoch 6/20  Iteration 906/3560 Training loss: 2.0500 0.3866 sec/batch
Epoch 6/20  Iteration 907/3560 Training loss: 2.0499 0.3920 sec/batch
Epoch 6/20  Iteration 908/3560 Training loss: 2.0518 0.3863 sec/batch
Epoch 6/20  Iteration 909/3560 Training loss: 2.0517 0.4119 sec/batch
Epoch 6/20  Iteration 910/3560 Training loss: 2.0514 0.3916 sec/batch
Epoch 6/20  Iteration 911/3560 Training loss: 2.0509 0.3884 sec/batch
Epoch 6/20  Iteration 912/3560 Training loss: 2.0521 0.3845 sec/batch
Epoch 6/20  Iteration 913/3560 Training loss: 2.0515 0.3869 sec/batch
Epoch 6/20  Iteration 914/3560 Training loss: 2.0509 0.3817 sec/batch
Epoch 6/20  Iteration 915/3560 Training loss: 2.0504 0.3894 sec/batch
Epoch 6/20  Iteration 916/3560 Training loss: 2.0495 0.3858 sec/batch
Epoch 6/20  Iteration 917/3560 Training loss: 2.0483 0.3821 sec/batch
Epoch 6/20  Iteration 918/3560 Training loss: 2.0486 0.3802 sec/batch
Epoch 6/20  Iteration 919/3560 Training loss: 2.0490 0.3822 sec/batch
Epoch 6/20  Iteration 920/3560 Training loss: 2.0494 0.3924 sec/batch
Epoch 6/20  Iteration 921/3560 Training loss: 2.0493 0.3840 sec/batch
Epoch 6/20  Iteration 922/3560 Training loss: 2.0487 0.3981 sec/batch
Epoch 6/20  Iteration 923/3560 Training loss: 2.0486 0.3879 sec/batch
Epoch 6/20  Iteration 924/3560 Training loss: 2.0490 0.3871 sec/batch
Epoch 6/20  Iteration 925/3560 Training loss: 2.0483 0.3967 sec/batch
Epoch 6/20  Iteration 926/3560 Training loss: 2.0479 0.3795 sec/batch
Epoch 6/20  Iteration 927/3560 Training loss: 2.0479 0.3802 sec/batch
Epoch 6/20  Iteration 928/3560 Training loss: 2.0467 0.3866 sec/batch
Epoch 6/20  Iteration 929/3560 Training loss: 2.0455 0.3817 sec/batch
Epoch 6/20  Iteration 930/3560 Training loss: 2.0445 0.3790 sec/batch
Epoch 6/20  Iteration 931/3560 Training loss: 2.0440 0.3790 sec/batch
Epoch 6/20  Iteration 932/3560 Training loss: 2.0436 0.3824 sec/batch
Epoch 6/20  Iteration 933/3560 Training loss: 2.0431 0.4018 sec/batch
Epoch 6/20  Iteration 934/3560 Training loss: 2.0422 0.3878 sec/batch
Epoch 6/20  Iteration 935/3560 Training loss: 2.0421 0.4012 sec/batch
Epoch 6/20  Iteration 936/3560 Training loss: 2.0408 0.3858 sec/batch
Epoch 6/20  Iteration 937/3560 Training loss: 2.0409 0.3824 sec/batch
Epoch 6/20  Iteration 938/3560 Training loss: 2.0403 0.3821 sec/batch
Epoch 6/20  Iteration 939/3560 Training loss: 2.0401 0.3836 sec/batch
Epoch 6/20  Iteration 940/3560 Training loss: 2.0407 0.3867 sec/batch
Epoch 6/20  Iteration 941/3560 Training loss: 2.0401 0.4048 sec/batch
Epoch 6/20  Iteration 942/3560 Training loss: 2.0406 0.3869 sec/batch
Epoch 6/20  Iteration 943/3560 Training loss: 2.0402 0.3836 sec/batch
Epoch 6/20  Iteration 944/3560 Training loss: 2.0399 0.3836 sec/batch
Epoch 6/20  Iteration 945/3560 Training loss: 2.0395 0.3926 sec/batch
Epoch 6/20  Iteration 946/3560 Training loss: 2.0396 0.3867 sec/batch
Epoch 6/20  Iteration 947/3560 Training loss: 2.0397 0.3841 sec/batch
Epoch 6/20  Iteration 948/3560 Training loss: 2.0395 0.4102 sec/batch
Epoch 6/20  Iteration 949/3560 Training loss: 2.0392 0.3842 sec/batch
Epoch 6/20  Iteration 950/3560 Training loss: 2.0395 0.3791 sec/batch
Epoch 6/20  Iteration 951/3560 Training loss: 2.0392 0.3800 sec/batch
Epoch 6/20  Iteration 952/3560 Training loss: 2.0397 0.3778 sec/batch
Epoch 6/20  Iteration 953/3560 Training loss: 2.0400 0.3782 sec/batch
Epoch 6/20  Iteration 954/3560 Training loss: 2.0401 0.3802 sec/batch
Epoch 6/20  Iteration 955/3560 Training loss: 2.0398 0.3817 sec/batch
Epoch 6/20  Iteration 956/3560 Training loss: 2.0400 0.3788 sec/batch
Epoch 6/20  Iteration 957/3560 Training loss: 2.0401 0.3828 sec/batch
Epoch 6/20  Iteration 958/3560 Training loss: 2.0395 0.3796 sec/batch
Epoch 6/20  Iteration 959/3560 Training loss: 2.0392 0.3832 sec/batch
Epoch 6/20  Iteration 960/3560 Training loss: 2.0392 0.3824 sec/batch
Epoch 6/20  Iteration 961/3560 Training loss: 2.0395 0.3902 sec/batch
Epoch 6/20  Iteration 962/3560 Training loss: 2.0394 0.4029 sec/batch
Epoch 6/20  Iteration 963/3560 Training loss: 2.0396 0.3818 sec/batch
Epoch 6/20  Iteration 964/3560 Training loss: 2.0392 0.3850 sec/batch
Epoch 6/20  Iteration 965/3560 Training loss: 2.0390 0.3787 sec/batch
Epoch 6/20  Iteration 966/3560 Training loss: 2.0392 0.3790 sec/batch
Epoch 6/20  Iteration 967/3560 Training loss: 2.0390 0.3823 sec/batch
Epoch 6/20  Iteration 968/3560 Training loss: 2.0390 0.3813 sec/batch
Epoch 6/20  Iteration 969/3560 Training loss: 2.0385 0.3800 sec/batch
Epoch 6/20  Iteration 970/3560 Training loss: 2.0383 0.3941 sec/batch
Epoch 6/20  Iteration 971/3560 Training loss: 2.0377 0.3854 sec/batch
Epoch 6/20  Iteration 972/3560 Training loss: 2.0377 0.3860 sec/batch
Epoch 6/20  Iteration 973/3560 Training loss: 2.0372 0.3881 sec/batch
Epoch 6/20  Iteration 974/3560 Training loss: 2.0369 0.3870 sec/batch
Epoch 6/20  Iteration 975/3560 Training loss: 2.0362 0.4059 sec/batch
Epoch 6/20  Iteration 976/3560 Training loss: 2.0358 0.3797 sec/batch
Epoch 6/20  Iteration 977/3560 Training loss: 2.0355 0.3888 sec/batch
Epoch 6/20  Iteration 978/3560 Training loss: 2.0353 0.3806 sec/batch
Epoch 6/20  Iteration 979/3560 Training loss: 2.0347 0.3867 sec/batch
Epoch 6/20  Iteration 980/3560 Training loss: 2.0348 0.3818 sec/batch
Epoch 6/20  Iteration 981/3560 Training loss: 2.0345 0.3792 sec/batch
Epoch 6/20  Iteration 982/3560 Training loss: 2.0343 0.3846 sec/batch
Epoch 6/20  Iteration 983/3560 Training loss: 2.0338 0.3794 sec/batch
Epoch 6/20  Iteration 984/3560 Training loss: 2.0335 0.3809 sec/batch
Epoch 6/20  Iteration 985/3560 Training loss: 2.0331 0.3808 sec/batch
Epoch 6/20  Iteration 986/3560 Training loss: 2.0327 0.3808 sec/batch
Epoch 6/20  Iteration 987/3560 Training loss: 2.0325 0.3837 sec/batch
Epoch 6/20  Iteration 988/3560 Training loss: 2.0320 0.4054 sec/batch
Epoch 6/20  Iteration 989/3560 Training loss: 2.0316 0.3882 sec/batch
Epoch 6/20  Iteration 990/3560 Training loss: 2.0311 0.3834 sec/batch
Epoch 6/20  Iteration 991/3560 Training loss: 2.0310 0.3935 sec/batch
Epoch 6/20  Iteration 992/3560 Training loss: 2.0308 0.3828 sec/batch
Epoch 6/20  Iteration 993/3560 Training loss: 2.0305 0.3856 sec/batch
Epoch 6/20  Iteration 994/3560 Training loss: 2.0302 0.3828 sec/batch
Epoch 6/20  Iteration 995/3560 Training loss: 2.0298 0.3859 sec/batch
Epoch 6/20  Iteration 996/3560 Training loss: 2.0296 0.3782 sec/batch
Epoch 6/20  Iteration 997/3560 Training loss: 2.0295 0.3848 sec/batch
Epoch 6/20  Iteration 998/3560 Training loss: 2.0294 0.3817 sec/batch
Epoch 6/20  Iteration 999/3560 Training loss: 2.0293 0.3874 sec/batch
Epoch 6/20  Iteration 1000/3560 Training loss: 2.0291 0.3864 sec/batch
Epoch 6/20  Iteration 1001/3560 Training loss: 2.0290 0.4152 sec/batch
Epoch 6/20  Iteration 1002/3560 Training loss: 2.0288 0.3881 sec/batch
Epoch 6/20  Iteration 1003/3560 Training loss: 2.0287 0.3772 sec/batch
Epoch 6/20  Iteration 1004/3560 Training loss: 2.0285 0.3840 sec/batch
Epoch 6/20  Iteration 1005/3560 Training loss: 2.0282 0.3851 sec/batch
Epoch 6/20  Iteration 1006/3560 Training loss: 2.0278 0.3904 sec/batch
Epoch 6/20  Iteration 1007/3560 Training loss: 2.0277 0.3867 sec/batch
Epoch 6/20  Iteration 1008/3560 Training loss: 2.0275 0.3885 sec/batch
Epoch 6/20  Iteration 1009/3560 Training loss: 2.0275 0.3906 sec/batch
Epoch 6/20  Iteration 1010/3560 Training loss: 2.0274 0.3887 sec/batch
Epoch 6/20  Iteration 1011/3560 Training loss: 2.0274 0.3839 sec/batch
Epoch 6/20  Iteration 1012/3560 Training loss: 2.0270 0.3780 sec/batch
Epoch 6/20  Iteration 1013/3560 Training loss: 2.0267 0.3793 sec/batch
Epoch 6/20  Iteration 1014/3560 Training loss: 2.0267 0.3990 sec/batch
Epoch 6/20  Iteration 1015/3560 Training loss: 2.0265 0.3964 sec/batch
Epoch 6/20  Iteration 1016/3560 Training loss: 2.0261 0.3836 sec/batch
Epoch 6/20  Iteration 1017/3560 Training loss: 2.0261 0.3846 sec/batch
Epoch 6/20  Iteration 1018/3560 Training loss: 2.0261 0.3971 sec/batch
Epoch 6/20  Iteration 1019/3560 Training loss: 2.0260 0.3913 sec/batch
Epoch 6/20  Iteration 1020/3560 Training loss: 2.0259 0.3978 sec/batch
Epoch 6/20  Iteration 1021/3560 Training loss: 2.0256 0.3846 sec/batch
Epoch 6/20  Iteration 1022/3560 Training loss: 2.0253 0.3814 sec/batch
Epoch 6/20  Iteration 1023/3560 Training loss: 2.0251 0.3896 sec/batch
Epoch 6/20  Iteration 1024/3560 Training loss: 2.0249 0.3859 sec/batch
Epoch 6/20  Iteration 1025/3560 Training loss: 2.0248 0.3862 sec/batch
Epoch 6/20  Iteration 1026/3560 Training loss: 2.0247 0.3809 sec/batch
Epoch 6/20  Iteration 1027/3560 Training loss: 2.0248 0.4074 sec/batch
Epoch 6/20  Iteration 1028/3560 Training loss: 2.0246 0.3782 sec/batch
Epoch 6/20  Iteration 1029/3560 Training loss: 2.0247 0.3795 sec/batch
Epoch 6/20  Iteration 1030/3560 Training loss: 2.0245 0.3806 sec/batch
Epoch 6/20  Iteration 1031/3560 Training loss: 2.0246 0.3814 sec/batch
Epoch 6/20  Iteration 1032/3560 Training loss: 2.0244 0.3892 sec/batch
Epoch 6/20  Iteration 1033/3560 Training loss: 2.0242 0.3825 sec/batch
Epoch 6/20  Iteration 1034/3560 Training loss: 2.0242 0.3864 sec/batch
Epoch 6/20  Iteration 1035/3560 Training loss: 2.0239 0.3863 sec/batch
Epoch 6/20  Iteration 1036/3560 Training loss: 2.0239 0.3861 sec/batch
Epoch 6/20  Iteration 1037/3560 Training loss: 2.0238 0.3933 sec/batch
Epoch 6/20  Iteration 1038/3560 Training loss: 2.0238 0.3893 sec/batch
Epoch 6/20  Iteration 1039/3560 Training loss: 2.0237 0.3823 sec/batch
Epoch 6/20  Iteration 1040/3560 Training loss: 2.0234 0.4019 sec/batch
Epoch 6/20  Iteration 1041/3560 Training loss: 2.0232 0.3802 sec/batch
Epoch 6/20  Iteration 1042/3560 Training loss: 2.0233 0.3866 sec/batch
Epoch 6/20  Iteration 1043/3560 Training loss: 2.0232 0.3859 sec/batch
Epoch 6/20  Iteration 1044/3560 Training loss: 2.0231 0.3872 sec/batch
Epoch 6/20  Iteration 1045/3560 Training loss: 2.0230 0.3887 sec/batch
Epoch 6/20  Iteration 1046/3560 Training loss: 2.0229 0.3886 sec/batch
Epoch 6/20  Iteration 1047/3560 Training loss: 2.0228 0.3856 sec/batch
Epoch 6/20  Iteration 1048/3560 Training loss: 2.0226 0.3959 sec/batch
Epoch 6/20  Iteration 1049/3560 Training loss: 2.0224 0.3837 sec/batch
Epoch 6/20  Iteration 1050/3560 Training loss: 2.0224 0.3807 sec/batch
Epoch 6/20  Iteration 1051/3560 Training loss: 2.0224 0.3869 sec/batch
Epoch 6/20  Iteration 1052/3560 Training loss: 2.0223 0.3858 sec/batch
Epoch 6/20  Iteration 1053/3560 Training loss: 2.0221 0.4037 sec/batch
Epoch 6/20  Iteration 1054/3560 Training loss: 2.0220 0.3809 sec/batch
Epoch 6/20  Iteration 1055/3560 Training loss: 2.0219 0.3829 sec/batch
Epoch 6/20  Iteration 1056/3560 Training loss: 2.0217 0.3868 sec/batch
Epoch 6/20  Iteration 1057/3560 Training loss: 2.0217 0.3838 sec/batch
Epoch 6/20  Iteration 1058/3560 Training loss: 2.0219 0.3853 sec/batch
Epoch 6/20  Iteration 1059/3560 Training loss: 2.0218 0.3789 sec/batch
Epoch 6/20  Iteration 1060/3560 Training loss: 2.0216 0.3831 sec/batch
Epoch 6/20  Iteration 1061/3560 Training loss: 2.0214 0.3895 sec/batch
Epoch 6/20  Iteration 1062/3560 Training loss: 2.0212 0.3832 sec/batch
Epoch 6/20  Iteration 1063/3560 Training loss: 2.0212 0.3908 sec/batch
Epoch 6/20  Iteration 1064/3560 Training loss: 2.0211 0.3914 sec/batch
Epoch 6/20  Iteration 1065/3560 Training loss: 2.0210 0.3826 sec/batch
Epoch 6/20  Iteration 1066/3560 Training loss: 2.0210 0.4027 sec/batch
Epoch 6/20  Iteration 1067/3560 Training loss: 2.0208 0.3807 sec/batch
Epoch 6/20  Iteration 1068/3560 Training loss: 2.0207 0.3835 sec/batch
Epoch 7/20  Iteration 1069/3560 Training loss: 2.0604 0.3786 sec/batch
Epoch 7/20  Iteration 1070/3560 Training loss: 2.0165 0.3799 sec/batch
Epoch 7/20  Iteration 1071/3560 Training loss: 2.0075 0.3854 sec/batch
Epoch 7/20  Iteration 1072/3560 Training loss: 2.0012 0.3823 sec/batch
Epoch 7/20  Iteration 1073/3560 Training loss: 1.9977 0.3820 sec/batch
Epoch 7/20  Iteration 1074/3560 Training loss: 1.9927 0.3935 sec/batch
Epoch 7/20  Iteration 1075/3560 Training loss: 1.9940 0.3833 sec/batch
Epoch 7/20  Iteration 1076/3560 Training loss: 1.9953 0.3830 sec/batch
Epoch 7/20  Iteration 1077/3560 Training loss: 1.9984 0.3834 sec/batch
Epoch 7/20  Iteration 1078/3560 Training loss: 1.9977 0.3785 sec/batch
Epoch 7/20  Iteration 1079/3560 Training loss: 1.9952 0.4100 sec/batch
Epoch 7/20  Iteration 1080/3560 Training loss: 1.9934 0.3854 sec/batch
Epoch 7/20  Iteration 1081/3560 Training loss: 1.9938 0.3926 sec/batch
Epoch 7/20  Iteration 1082/3560 Training loss: 1.9954 0.3849 sec/batch
Epoch 7/20  Iteration 1083/3560 Training loss: 1.9950 0.3816 sec/batch
Epoch 7/20  Iteration 1084/3560 Training loss: 1.9936 0.3795 sec/batch
Epoch 7/20  Iteration 1085/3560 Training loss: 1.9934 0.3829 sec/batch
Epoch 7/20  Iteration 1086/3560 Training loss: 1.9957 0.3796 sec/batch
Epoch 7/20  Iteration 1087/3560 Training loss: 1.9956 0.3790 sec/batch
Epoch 7/20  Iteration 1088/3560 Training loss: 1.9954 0.3855 sec/batch
Epoch 7/20  Iteration 1089/3560 Training loss: 1.9946 0.3848 sec/batch
Epoch 7/20  Iteration 1090/3560 Training loss: 1.9954 0.3878 sec/batch
Epoch 7/20  Iteration 1091/3560 Training loss: 1.9954 0.3803 sec/batch
Epoch 7/20  Iteration 1092/3560 Training loss: 1.9946 0.4040 sec/batch
Epoch 7/20  Iteration 1093/3560 Training loss: 1.9941 0.3833 sec/batch
Epoch 7/20  Iteration 1094/3560 Training loss: 1.9934 0.3912 sec/batch
Epoch 7/20  Iteration 1095/3560 Training loss: 1.9925 0.3976 sec/batch
Epoch 7/20  Iteration 1096/3560 Training loss: 1.9927 0.3832 sec/batch
Epoch 7/20  Iteration 1097/3560 Training loss: 1.9933 0.3847 sec/batch
Epoch 7/20  Iteration 1098/3560 Training loss: 1.9938 0.3828 sec/batch
Epoch 7/20  Iteration 1099/3560 Training loss: 1.9932 0.3832 sec/batch
Epoch 7/20  Iteration 1100/3560 Training loss: 1.9925 0.3930 sec/batch
Epoch 7/20  Iteration 1101/3560 Training loss: 1.9924 0.3908 sec/batch
Epoch 7/20  Iteration 1102/3560 Training loss: 1.9932 0.3816 sec/batch
Epoch 7/20  Iteration 1103/3560 Training loss: 1.9925 0.3814 sec/batch
Epoch 7/20  Iteration 1104/3560 Training loss: 1.9924 0.3814 sec/batch
Epoch 7/20  Iteration 1105/3560 Training loss: 1.9922 0.4047 sec/batch
Epoch 7/20  Iteration 1106/3560 Training loss: 1.9910 0.3860 sec/batch
Epoch 7/20  Iteration 1107/3560 Training loss: 1.9900 0.3886 sec/batch
Epoch 7/20  Iteration 1108/3560 Training loss: 1.9892 0.3852 sec/batch
Epoch 7/20  Iteration 1109/3560 Training loss: 1.9886 0.3963 sec/batch
Epoch 7/20  Iteration 1110/3560 Training loss: 1.9882 0.3863 sec/batch
Epoch 7/20  Iteration 1111/3560 Training loss: 1.9880 0.3755 sec/batch
Epoch 7/20  Iteration 1112/3560 Training loss: 1.9872 0.3868 sec/batch
Epoch 7/20  Iteration 1113/3560 Training loss: 1.9870 0.3859 sec/batch
Epoch 7/20  Iteration 1114/3560 Training loss: 1.9857 0.3829 sec/batch
Epoch 7/20  Iteration 1115/3560 Training loss: 1.9856 0.3832 sec/batch
Epoch 7/20  Iteration 1116/3560 Training loss: 1.9850 0.3823 sec/batch
Epoch 7/20  Iteration 1117/3560 Training loss: 1.9849 0.3818 sec/batch
Epoch 7/20  Iteration 1118/3560 Training loss: 1.9855 0.4069 sec/batch
Epoch 7/20  Iteration 1119/3560 Training loss: 1.9849 0.3884 sec/batch
Epoch 7/20  Iteration 1120/3560 Training loss: 1.9856 0.3798 sec/batch
Epoch 7/20  Iteration 1121/3560 Training loss: 1.9854 0.3829 sec/batch
Epoch 7/20  Iteration 1122/3560 Training loss: 1.9854 0.3881 sec/batch
Epoch 7/20  Iteration 1123/3560 Training loss: 1.9851 0.3832 sec/batch
Epoch 7/20  Iteration 1124/3560 Training loss: 1.9852 0.3854 sec/batch
Epoch 7/20  Iteration 1125/3560 Training loss: 1.9854 0.3920 sec/batch
Epoch 7/20  Iteration 1126/3560 Training loss: 1.9852 0.3851 sec/batch
Epoch 7/20  Iteration 1127/3560 Training loss: 1.9847 0.3894 sec/batch
Epoch 7/20  Iteration 1128/3560 Training loss: 1.9850 0.3861 sec/batch
Epoch 7/20  Iteration 1129/3560 Training loss: 1.9847 0.3814 sec/batch
Epoch 7/20  Iteration 1130/3560 Training loss: 1.9851 0.3811 sec/batch
Epoch 7/20  Iteration 1131/3560 Training loss: 1.9854 0.4083 sec/batch
Epoch 7/20  Iteration 1132/3560 Training loss: 1.9856 0.3903 sec/batch
Epoch 7/20  Iteration 1133/3560 Training loss: 1.9853 0.3893 sec/batch
Epoch 7/20  Iteration 1134/3560 Training loss: 1.9856 0.3760 sec/batch
Epoch 7/20  Iteration 1135/3560 Training loss: 1.9856 0.3820 sec/batch
Epoch 7/20  Iteration 1136/3560 Training loss: 1.9851 0.3835 sec/batch
Epoch 7/20  Iteration 1137/3560 Training loss: 1.9849 0.3868 sec/batch
Epoch 7/20  Iteration 1138/3560 Training loss: 1.9847 0.3858 sec/batch
Epoch 7/20  Iteration 1139/3560 Training loss: 1.9851 0.3867 sec/batch
Epoch 7/20  Iteration 1140/3560 Training loss: 1.9851 0.3942 sec/batch
Epoch 7/20  Iteration 1141/3560 Training loss: 1.9852 0.3917 sec/batch
Epoch 7/20  Iteration 1142/3560 Training loss: 1.9849 0.3862 sec/batch
Epoch 7/20  Iteration 1143/3560 Training loss: 1.9847 0.3905 sec/batch
Epoch 7/20  Iteration 1144/3560 Training loss: 1.9849 0.4200 sec/batch
Epoch 7/20  Iteration 1145/3560 Training loss: 1.9847 0.3887 sec/batch
Epoch 7/20  Iteration 1146/3560 Training loss: 1.9846 0.3859 sec/batch
Epoch 7/20  Iteration 1147/3560 Training loss: 1.9842 0.3816 sec/batch
Epoch 7/20  Iteration 1148/3560 Training loss: 1.9840 0.3963 sec/batch
Epoch 7/20  Iteration 1149/3560 Training loss: 1.9834 0.3878 sec/batch
Epoch 7/20  Iteration 1150/3560 Training loss: 1.9836 0.3844 sec/batch
Epoch 7/20  Iteration 1151/3560 Training loss: 1.9832 0.3810 sec/batch
Epoch 7/20  Iteration 1152/3560 Training loss: 1.9829 0.3785 sec/batch
Epoch 7/20  Iteration 1153/3560 Training loss: 1.9822 0.3831 sec/batch
Epoch 7/20  Iteration 1154/3560 Training loss: 1.9819 0.3926 sec/batch
Epoch 7/20  Iteration 1155/3560 Training loss: 1.9817 0.3869 sec/batch
Epoch 7/20  Iteration 1156/3560 Training loss: 1.9814 0.3846 sec/batch
Epoch 7/20  Iteration 1157/3560 Training loss: 1.9808 0.4092 sec/batch
Epoch 7/20  Iteration 1158/3560 Training loss: 1.9808 0.3857 sec/batch
Epoch 7/20  Iteration 1159/3560 Training loss: 1.9805 0.3868 sec/batch
Epoch 7/20  Iteration 1160/3560 Training loss: 1.9804 0.3876 sec/batch
Epoch 7/20  Iteration 1161/3560 Training loss: 1.9799 0.3871 sec/batch
Epoch 7/20  Iteration 1162/3560 Training loss: 1.9795 0.3840 sec/batch
Epoch 7/20  Iteration 1163/3560 Training loss: 1.9791 0.3866 sec/batch
Epoch 7/20  Iteration 1164/3560 Training loss: 1.9789 0.3817 sec/batch
Epoch 7/20  Iteration 1165/3560 Training loss: 1.9787 0.3873 sec/batch
Epoch 7/20  Iteration 1166/3560 Training loss: 1.9783 0.3858 sec/batch
Epoch 7/20  Iteration 1167/3560 Training loss: 1.9779 0.3872 sec/batch
Epoch 7/20  Iteration 1168/3560 Training loss: 1.9774 0.3924 sec/batch
Epoch 7/20  Iteration 1169/3560 Training loss: 1.9774 0.3799 sec/batch
Epoch 7/20  Iteration 1170/3560 Training loss: 1.9773 0.4075 sec/batch
Epoch 7/20  Iteration 1171/3560 Training loss: 1.9769 0.3995 sec/batch
Epoch 7/20  Iteration 1172/3560 Training loss: 1.9766 0.3880 sec/batch
Epoch 7/20  Iteration 1173/3560 Training loss: 1.9762 0.3872 sec/batch
Epoch 7/20  Iteration 1174/3560 Training loss: 1.9760 0.3840 sec/batch
Epoch 7/20  Iteration 1175/3560 Training loss: 1.9758 0.3826 sec/batch
Epoch 7/20  Iteration 1176/3560 Training loss: 1.9759 0.3838 sec/batch
Epoch 7/20  Iteration 1177/3560 Training loss: 1.9758 0.3845 sec/batch
Epoch 7/20  Iteration 1178/3560 Training loss: 1.9757 0.3838 sec/batch
Epoch 7/20  Iteration 1179/3560 Training loss: 1.9755 0.4000 sec/batch
Epoch 7/20  Iteration 1180/3560 Training loss: 1.9754 0.4076 sec/batch
Epoch 7/20  Iteration 1181/3560 Training loss: 1.9752 0.3863 sec/batch
Epoch 7/20  Iteration 1182/3560 Training loss: 1.9751 0.3828 sec/batch
Epoch 7/20  Iteration 1183/3560 Training loss: 1.9748 0.4163 sec/batch
Epoch 7/20  Iteration 1184/3560 Training loss: 1.9744 0.3875 sec/batch
Epoch 7/20  Iteration 1185/3560 Training loss: 1.9743 0.3806 sec/batch
Epoch 7/20  Iteration 1186/3560 Training loss: 1.9742 0.3803 sec/batch
Epoch 7/20  Iteration 1187/3560 Training loss: 1.9742 0.3920 sec/batch
Epoch 7/20  Iteration 1188/3560 Training loss: 1.9739 0.3901 sec/batch
Epoch 7/20  Iteration 1189/3560 Training loss: 1.9739 0.3802 sec/batch
Epoch 7/20  Iteration 1190/3560 Training loss: 1.9736 0.3863 sec/batch
Epoch 7/20  Iteration 1191/3560 Training loss: 1.9734 0.3798 sec/batch
Epoch 7/20  Iteration 1192/3560 Training loss: 1.9735 0.3826 sec/batch
Epoch 7/20  Iteration 1193/3560 Training loss: 1.9735 0.3811 sec/batch
Epoch 7/20  Iteration 1194/3560 Training loss: 1.9731 0.3816 sec/batch
Epoch 7/20  Iteration 1195/3560 Training loss: 1.9731 0.3848 sec/batch
Epoch 7/20  Iteration 1196/3560 Training loss: 1.9732 0.4152 sec/batch
Epoch 7/20  Iteration 1197/3560 Training loss: 1.9731 0.3887 sec/batch
Epoch 7/20  Iteration 1198/3560 Training loss: 1.9730 0.3930 sec/batch
Epoch 7/20  Iteration 1199/3560 Training loss: 1.9727 0.3942 sec/batch
Epoch 7/20  Iteration 1200/3560 Training loss: 1.9723 0.3836 sec/batch
Epoch 7/20  Iteration 1201/3560 Training loss: 1.9722 0.3933 sec/batch
Epoch 7/20  Iteration 1202/3560 Training loss: 1.9722 0.3855 sec/batch
Epoch 7/20  Iteration 1203/3560 Training loss: 1.9722 0.3774 sec/batch
Epoch 7/20  Iteration 1204/3560 Training loss: 1.9722 0.3917 sec/batch
Epoch 7/20  Iteration 1205/3560 Training loss: 1.9721 0.3820 sec/batch
Epoch 7/20  Iteration 1206/3560 Training loss: 1.9721 0.4929 sec/batch
Epoch 7/20  Iteration 1207/3560 Training loss: 1.9723 0.4094 sec/batch
Epoch 7/20  Iteration 1208/3560 Training loss: 1.9720 0.4215 sec/batch
Epoch 7/20  Iteration 1209/3560 Training loss: 1.9722 0.4341 sec/batch
Epoch 7/20  Iteration 1210/3560 Training loss: 1.9720 0.3917 sec/batch
Epoch 7/20  Iteration 1211/3560 Training loss: 1.9719 0.4058 sec/batch
Epoch 7/20  Iteration 1212/3560 Training loss: 1.9719 0.3881 sec/batch
Epoch 7/20  Iteration 1213/3560 Training loss: 1.9717 0.4023 sec/batch
Epoch 7/20  Iteration 1214/3560 Training loss: 1.9716 0.3817 sec/batch
Epoch 7/20  Iteration 1215/3560 Training loss: 1.9716 0.4258 sec/batch
Epoch 7/20  Iteration 1216/3560 Training loss: 1.9717 0.5277 sec/batch
Epoch 7/20  Iteration 1217/3560 Training loss: 1.9715 0.4590 sec/batch
Epoch 7/20  Iteration 1218/3560 Training loss: 1.9713 0.3860 sec/batch
Epoch 7/20  Iteration 1219/3560 Training loss: 1.9710 0.3871 sec/batch
Epoch 7/20  Iteration 1220/3560 Training loss: 1.9710 0.3901 sec/batch
Epoch 7/20  Iteration 1221/3560 Training loss: 1.9710 0.3984 sec/batch
Epoch 7/20  Iteration 1222/3560 Training loss: 1.9709 0.3841 sec/batch
Epoch 7/20  Iteration 1223/3560 Training loss: 1.9708 0.3901 sec/batch
Epoch 7/20  Iteration 1224/3560 Training loss: 1.9707 0.3853 sec/batch
Epoch 7/20  Iteration 1225/3560 Training loss: 1.9707 0.3871 sec/batch
Epoch 7/20  Iteration 1226/3560 Training loss: 1.9706 0.3861 sec/batch
Epoch 7/20  Iteration 1227/3560 Training loss: 1.9703 0.3865 sec/batch
Epoch 7/20  Iteration 1228/3560 Training loss: 1.9704 0.3972 sec/batch
Epoch 7/20  Iteration 1229/3560 Training loss: 1.9705 0.3924 sec/batch
Epoch 7/20  Iteration 1230/3560 Training loss: 1.9703 0.3859 sec/batch
Epoch 7/20  Iteration 1231/3560 Training loss: 1.9703 0.3867 sec/batch
Epoch 7/20  Iteration 1232/3560 Training loss: 1.9703 0.3837 sec/batch
Epoch 7/20  Iteration 1233/3560 Training loss: 1.9702 0.3875 sec/batch
Epoch 7/20  Iteration 1234/3560 Training loss: 1.9700 0.4104 sec/batch
Epoch 7/20  Iteration 1235/3560 Training loss: 1.9700 0.3827 sec/batch
Epoch 7/20  Iteration 1236/3560 Training loss: 1.9702 0.3794 sec/batch
Epoch 7/20  Iteration 1237/3560 Training loss: 1.9701 0.3870 sec/batch
Epoch 7/20  Iteration 1238/3560 Training loss: 1.9700 0.3865 sec/batch
Epoch 7/20  Iteration 1239/3560 Training loss: 1.9699 0.3848 sec/batch
Epoch 7/20  Iteration 1240/3560 Training loss: 1.9697 0.3877 sec/batch
Epoch 7/20  Iteration 1241/3560 Training loss: 1.9697 0.3919 sec/batch
Epoch 7/20  Iteration 1242/3560 Training loss: 1.9697 0.4308 sec/batch
Epoch 7/20  Iteration 1243/3560 Training loss: 1.9697 0.4295 sec/batch
Epoch 7/20  Iteration 1244/3560 Training loss: 1.9696 0.3868 sec/batch
Epoch 7/20  Iteration 1245/3560 Training loss: 1.9694 0.3815 sec/batch
Epoch 7/20  Iteration 1246/3560 Training loss: 1.9694 0.4112 sec/batch
Epoch 8/20  Iteration 1247/3560 Training loss: 2.0159 0.3898 sec/batch
Epoch 8/20  Iteration 1248/3560 Training loss: 1.9779 0.3842 sec/batch
Epoch 8/20  Iteration 1249/3560 Training loss: 1.9654 0.3800 sec/batch
Epoch 8/20  Iteration 1250/3560 Training loss: 1.9594 0.3852 sec/batch
Epoch 8/20  Iteration 1251/3560 Training loss: 1.9553 0.3843 sec/batch
Epoch 8/20  Iteration 1252/3560 Training loss: 1.9469 0.3884 sec/batch
Epoch 8/20  Iteration 1253/3560 Training loss: 1.9475 0.3820 sec/batch
Epoch 8/20  Iteration 1254/3560 Training loss: 1.9470 0.3858 sec/batch
Epoch 8/20  Iteration 1255/3560 Training loss: 1.9495 0.4000 sec/batch
Epoch 8/20  Iteration 1256/3560 Training loss: 1.9497 0.3878 sec/batch
Epoch 8/20  Iteration 1257/3560 Training loss: 1.9471 0.3963 sec/batch
Epoch 8/20  Iteration 1258/3560 Training loss: 1.9452 0.3831 sec/batch
Epoch 8/20  Iteration 1259/3560 Training loss: 1.9456 0.3966 sec/batch
Epoch 8/20  Iteration 1260/3560 Training loss: 1.9477 0.4019 sec/batch
Epoch 8/20  Iteration 1261/3560 Training loss: 1.9471 0.3883 sec/batch
Epoch 8/20  Iteration 1262/3560 Training loss: 1.9446 0.3863 sec/batch
Epoch 8/20  Iteration 1263/3560 Training loss: 1.9442 0.3942 sec/batch
Epoch 8/20  Iteration 1264/3560 Training loss: 1.9465 0.3829 sec/batch
Epoch 8/20  Iteration 1265/3560 Training loss: 1.9461 0.3882 sec/batch
Epoch 8/20  Iteration 1266/3560 Training loss: 1.9462 0.3840 sec/batch
Epoch 8/20  Iteration 1267/3560 Training loss: 1.9457 0.3800 sec/batch
Epoch 8/20  Iteration 1268/3560 Training loss: 1.9467 0.3866 sec/batch
Epoch 8/20  Iteration 1269/3560 Training loss: 1.9468 0.3866 sec/batch
Epoch 8/20  Iteration 1270/3560 Training loss: 1.9462 0.3898 sec/batch
Epoch 8/20  Iteration 1271/3560 Training loss: 1.9461 0.3870 sec/batch
Epoch 8/20  Iteration 1272/3560 Training loss: 1.9454 0.4109 sec/batch
Epoch 8/20  Iteration 1273/3560 Training loss: 1.9444 0.3873 sec/batch
Epoch 8/20  Iteration 1274/3560 Training loss: 1.9447 0.3848 sec/batch
Epoch 8/20  Iteration 1275/3560 Training loss: 1.9454 0.3867 sec/batch
Epoch 8/20  Iteration 1276/3560 Training loss: 1.9456 0.3954 sec/batch
Epoch 8/20  Iteration 1277/3560 Training loss: 1.9453 0.3820 sec/batch
Epoch 8/20  Iteration 1278/3560 Training loss: 1.9444 0.3850 sec/batch
Epoch 8/20  Iteration 1279/3560 Training loss: 1.9443 0.3804 sec/batch
Epoch 8/20  Iteration 1280/3560 Training loss: 1.9450 0.3853 sec/batch
Epoch 8/20  Iteration 1281/3560 Training loss: 1.9447 0.3818 sec/batch
Epoch 8/20  Iteration 1282/3560 Training loss: 1.9444 0.3841 sec/batch
Epoch 8/20  Iteration 1283/3560 Training loss: 1.9439 0.3865 sec/batch
Epoch 8/20  Iteration 1284/3560 Training loss: 1.9428 0.3852 sec/batch
Epoch 8/20  Iteration 1285/3560 Training loss: 1.9418 0.3991 sec/batch
Epoch 8/20  Iteration 1286/3560 Training loss: 1.9411 0.3927 sec/batch
Epoch 8/20  Iteration 1287/3560 Training loss: 1.9404 0.3934 sec/batch
Epoch 8/20  Iteration 1288/3560 Training loss: 1.9404 0.3849 sec/batch
Epoch 8/20  Iteration 1289/3560 Training loss: 1.9401 0.3823 sec/batch
Epoch 8/20  Iteration 1290/3560 Training loss: 1.9394 0.3804 sec/batch
Epoch 8/20  Iteration 1291/3560 Training loss: 1.9393 0.3974 sec/batch
Epoch 8/20  Iteration 1292/3560 Training loss: 1.9381 0.3948 sec/batch
Epoch 8/20  Iteration 1293/3560 Training loss: 1.9381 0.3825 sec/batch
Epoch 8/20  Iteration 1294/3560 Training loss: 1.9376 0.3824 sec/batch
Epoch 8/20  Iteration 1295/3560 Training loss: 1.9374 0.3808 sec/batch
Epoch 8/20  Iteration 1296/3560 Training loss: 1.9382 0.3837 sec/batch
Epoch 8/20  Iteration 1297/3560 Training loss: 1.9375 0.3817 sec/batch
Epoch 8/20  Iteration 1298/3560 Training loss: 1.9381 0.3927 sec/batch
Epoch 8/20  Iteration 1299/3560 Training loss: 1.9380 0.4072 sec/batch
Epoch 8/20  Iteration 1300/3560 Training loss: 1.9378 0.3859 sec/batch
Epoch 8/20  Iteration 1301/3560 Training loss: 1.9376 0.3974 sec/batch
Epoch 8/20  Iteration 1302/3560 Training loss: 1.9379 0.3834 sec/batch
Epoch 8/20  Iteration 1303/3560 Training loss: 1.9382 0.3850 sec/batch
Epoch 8/20  Iteration 1304/3560 Training loss: 1.9379 0.3837 sec/batch
Epoch 8/20  Iteration 1305/3560 Training loss: 1.9374 0.3803 sec/batch
Epoch 8/20  Iteration 1306/3560 Training loss: 1.9378 0.3846 sec/batch
Epoch 8/20  Iteration 1307/3560 Training loss: 1.9376 0.3847 sec/batch
Epoch 8/20  Iteration 1308/3560 Training loss: 1.9381 0.3870 sec/batch
Epoch 8/20  Iteration 1309/3560 Training loss: 1.9384 0.3863 sec/batch
Epoch 8/20  Iteration 1310/3560 Training loss: 1.9386 0.3907 sec/batch
Epoch 8/20  Iteration 1311/3560 Training loss: 1.9383 0.4348 sec/batch
Epoch 8/20  Iteration 1312/3560 Training loss: 1.9385 0.3821 sec/batch
Epoch 8/20  Iteration 1313/3560 Training loss: 1.9387 0.3867 sec/batch
Epoch 8/20  Iteration 1314/3560 Training loss: 1.9382 0.3806 sec/batch
Epoch 8/20  Iteration 1315/3560 Training loss: 1.9381 0.3847 sec/batch
Epoch 8/20  Iteration 1316/3560 Training loss: 1.9381 0.3889 sec/batch
Epoch 8/20  Iteration 1317/3560 Training loss: 1.9384 0.3844 sec/batch
Epoch 8/20  Iteration 1318/3560 Training loss: 1.9383 0.3827 sec/batch
Epoch 8/20  Iteration 1319/3560 Training loss: 1.9386 0.3844 sec/batch
Epoch 8/20  Iteration 1320/3560 Training loss: 1.9382 0.3882 sec/batch
Epoch 8/20  Iteration 1321/3560 Training loss: 1.9380 0.3866 sec/batch
Epoch 8/20  Iteration 1322/3560 Training loss: 1.9383 0.4079 sec/batch
Epoch 8/20  Iteration 1323/3560 Training loss: 1.9382 0.3767 sec/batch
Epoch 8/20  Iteration 1324/3560 Training loss: 1.9383 0.4199 sec/batch
Epoch 8/20  Iteration 1325/3560 Training loss: 1.9379 0.3848 sec/batch
Epoch 8/20  Iteration 1326/3560 Training loss: 1.9376 0.3818 sec/batch
Epoch 8/20  Iteration 1327/3560 Training loss: 1.9369 0.3862 sec/batch
Epoch 8/20  Iteration 1328/3560 Training loss: 1.9370 0.3859 sec/batch
Epoch 8/20  Iteration 1329/3560 Training loss: 1.9365 0.3838 sec/batch
Epoch 8/20  Iteration 1330/3560 Training loss: 1.9363 0.3951 sec/batch
Epoch 8/20  Iteration 1331/3560 Training loss: 1.9359 0.3823 sec/batch
Epoch 8/20  Iteration 1332/3560 Training loss: 1.9355 0.3826 sec/batch
Epoch 8/20  Iteration 1333/3560 Training loss: 1.9355 0.3907 sec/batch
Epoch 8/20  Iteration 1334/3560 Training loss: 1.9353 0.3862 sec/batch
Epoch 8/20  Iteration 1335/3560 Training loss: 1.9348 0.3866 sec/batch
Epoch 8/20  Iteration 1336/3560 Training loss: 1.9348 0.3817 sec/batch
Epoch 8/20  Iteration 1337/3560 Training loss: 1.9344 0.4112 sec/batch
Epoch 8/20  Iteration 1338/3560 Training loss: 1.9344 0.3909 sec/batch
Epoch 8/20  Iteration 1339/3560 Training loss: 1.9338 0.3821 sec/batch
Epoch 8/20  Iteration 1340/3560 Training loss: 1.9334 0.3828 sec/batch
Epoch 8/20  Iteration 1341/3560 Training loss: 1.9331 0.3832 sec/batch
Epoch 8/20  Iteration 1342/3560 Training loss: 1.9330 0.3827 sec/batch
Epoch 8/20  Iteration 1343/3560 Training loss: 1.9329 0.3870 sec/batch
Epoch 8/20  Iteration 1344/3560 Training loss: 1.9325 0.3805 sec/batch
Epoch 8/20  Iteration 1345/3560 Training loss: 1.9321 0.3851 sec/batch
Epoch 8/20  Iteration 1346/3560 Training loss: 1.9316 0.3810 sec/batch
Epoch 8/20  Iteration 1347/3560 Training loss: 1.9316 0.3922 sec/batch
Epoch 8/20  Iteration 1348/3560 Training loss: 1.9315 0.3861 sec/batch
Epoch 8/20  Iteration 1349/3560 Training loss: 1.9312 0.3831 sec/batch
Epoch 8/20  Iteration 1350/3560 Training loss: 1.9310 0.3988 sec/batch
Epoch 8/20  Iteration 1351/3560 Training loss: 1.9307 0.3909 sec/batch
Epoch 8/20  Iteration 1352/3560 Training loss: 1.9305 0.3893 sec/batch
Epoch 8/20  Iteration 1353/3560 Training loss: 1.9304 0.3862 sec/batch
Epoch 8/20  Iteration 1354/3560 Training loss: 1.9305 0.3977 sec/batch
Epoch 8/20  Iteration 1355/3560 Training loss: 1.9305 0.3895 sec/batch
Epoch 8/20  Iteration 1356/3560 Training loss: 1.9304 0.3804 sec/batch
Epoch 8/20  Iteration 1357/3560 Training loss: 1.9302 0.3844 sec/batch
Epoch 8/20  Iteration 1358/3560 Training loss: 1.9300 0.3807 sec/batch
Epoch 8/20  Iteration 1359/3560 Training loss: 1.9297 0.3841 sec/batch
Epoch 8/20  Iteration 1360/3560 Training loss: 1.9296 0.3942 sec/batch
Epoch 8/20  Iteration 1361/3560 Training loss: 1.9293 0.3823 sec/batch
Epoch 8/20  Iteration 1362/3560 Training loss: 1.9289 0.3885 sec/batch
Epoch 8/20  Iteration 1363/3560 Training loss: 1.9289 0.3949 sec/batch
Epoch 8/20  Iteration 1364/3560 Training loss: 1.9289 0.3935 sec/batch
Epoch 8/20  Iteration 1365/3560 Training loss: 1.9288 0.3795 sec/batch
Epoch 8/20  Iteration 1366/3560 Training loss: 1.9287 0.3938 sec/batch
Epoch 8/20  Iteration 1367/3560 Training loss: 1.9287 0.3960 sec/batch
Epoch 8/20  Iteration 1368/3560 Training loss: 1.9283 0.3853 sec/batch
Epoch 8/20  Iteration 1369/3560 Training loss: 1.9281 0.3844 sec/batch
Epoch 8/20  Iteration 1370/3560 Training loss: 1.9282 0.3878 sec/batch
Epoch 8/20  Iteration 1371/3560 Training loss: 1.9280 0.3880 sec/batch
Epoch 8/20  Iteration 1372/3560 Training loss: 1.9276 0.3841 sec/batch
Epoch 8/20  Iteration 1373/3560 Training loss: 1.9276 0.3906 sec/batch
Epoch 8/20  Iteration 1374/3560 Training loss: 1.9277 0.3831 sec/batch
Epoch 8/20  Iteration 1375/3560 Training loss: 1.9276 0.3885 sec/batch
Epoch 8/20  Iteration 1376/3560 Training loss: 1.9275 0.3911 sec/batch
Epoch 8/20  Iteration 1377/3560 Training loss: 1.9272 0.4055 sec/batch
Epoch 8/20  Iteration 1378/3560 Training loss: 1.9270 0.3878 sec/batch
Epoch 8/20  Iteration 1379/3560 Training loss: 1.9269 0.3844 sec/batch
Epoch 8/20  Iteration 1380/3560 Training loss: 1.9269 0.3976 sec/batch
Epoch 8/20  Iteration 1381/3560 Training loss: 1.9268 0.3822 sec/batch
Epoch 8/20  Iteration 1382/3560 Training loss: 1.9268 0.3877 sec/batch
Epoch 8/20  Iteration 1383/3560 Training loss: 1.9268 0.3851 sec/batch
Epoch 8/20  Iteration 1384/3560 Training loss: 1.9268 0.3857 sec/batch
Epoch 8/20  Iteration 1385/3560 Training loss: 1.9269 0.3867 sec/batch
Epoch 8/20  Iteration 1386/3560 Training loss: 1.9267 0.3808 sec/batch
Epoch 8/20  Iteration 1387/3560 Training loss: 1.9269 0.3914 sec/batch
Epoch 8/20  Iteration 1388/3560 Training loss: 1.9267 0.3820 sec/batch
Epoch 8/20  Iteration 1389/3560 Training loss: 1.9266 0.3946 sec/batch
Epoch 8/20  Iteration 1390/3560 Training loss: 1.9266 0.3986 sec/batch
Epoch 8/20  Iteration 1391/3560 Training loss: 1.9264 0.3801 sec/batch
Epoch 8/20  Iteration 1392/3560 Training loss: 1.9266 0.3872 sec/batch
Epoch 8/20  Iteration 1393/3560 Training loss: 1.9266 0.3888 sec/batch
Epoch 8/20  Iteration 1394/3560 Training loss: 1.9267 0.3953 sec/batch
Epoch 8/20  Iteration 1395/3560 Training loss: 1.9265 0.3795 sec/batch
Epoch 8/20  Iteration 1396/3560 Training loss: 1.9263 0.3828 sec/batch
Epoch 8/20  Iteration 1397/3560 Training loss: 1.9261 0.3860 sec/batch
Epoch 8/20  Iteration 1398/3560 Training loss: 1.9261 0.3901 sec/batch
Epoch 8/20  Iteration 1399/3560 Training loss: 1.9261 0.3903 sec/batch
Epoch 8/20  Iteration 1400/3560 Training loss: 1.9261 0.3817 sec/batch
Epoch 8/20  Iteration 1401/3560 Training loss: 1.9260 0.3930 sec/batch
Epoch 8/20  Iteration 1402/3560 Training loss: 1.9259 0.3916 sec/batch
Epoch 8/20  Iteration 1403/3560 Training loss: 1.9258 0.3915 sec/batch
Epoch 8/20  Iteration 1404/3560 Training loss: 1.9257 0.3860 sec/batch
Epoch 8/20  Iteration 1405/3560 Training loss: 1.9255 0.3863 sec/batch
Epoch 8/20  Iteration 1406/3560 Training loss: 1.9256 0.3894 sec/batch
Epoch 8/20  Iteration 1407/3560 Training loss: 1.9257 0.3905 sec/batch
Epoch 8/20  Iteration 1408/3560 Training loss: 1.9256 0.3943 sec/batch
Epoch 8/20  Iteration 1409/3560 Training loss: 1.9256 0.3832 sec/batch
Epoch 8/20  Iteration 1410/3560 Training loss: 1.9255 0.3793 sec/batch
Epoch 8/20  Iteration 1411/3560 Training loss: 1.9254 0.3872 sec/batch
Epoch 8/20  Iteration 1412/3560 Training loss: 1.9253 0.3903 sec/batch
Epoch 8/20  Iteration 1413/3560 Training loss: 1.9253 0.3920 sec/batch
Epoch 8/20  Iteration 1414/3560 Training loss: 1.9256 0.3868 sec/batch
Epoch 8/20  Iteration 1415/3560 Training loss: 1.9255 0.3951 sec/batch
Epoch 8/20  Iteration 1416/3560 Training loss: 1.9253 0.3927 sec/batch
Epoch 8/20  Iteration 1417/3560 Training loss: 1.9252 0.3851 sec/batch
Epoch 8/20  Iteration 1418/3560 Training loss: 1.9250 0.3944 sec/batch
Epoch 8/20  Iteration 1419/3560 Training loss: 1.9250 0.3904 sec/batch
Epoch 8/20  Iteration 1420/3560 Training loss: 1.9250 0.3828 sec/batch
Epoch 8/20  Iteration 1421/3560 Training loss: 1.9250 0.3813 sec/batch
Epoch 8/20  Iteration 1422/3560 Training loss: 1.9250 0.3834 sec/batch
Epoch 8/20  Iteration 1423/3560 Training loss: 1.9248 0.3900 sec/batch
Epoch 8/20  Iteration 1424/3560 Training loss: 1.9248 0.3808 sec/batch
Epoch 9/20  Iteration 1425/3560 Training loss: 1.9783 0.3816 sec/batch
Epoch 9/20  Iteration 1426/3560 Training loss: 1.9355 0.3840 sec/batch
Epoch 9/20  Iteration 1427/3560 Training loss: 1.9256 0.3753 sec/batch
Epoch 9/20  Iteration 1428/3560 Training loss: 1.9191 0.3947 sec/batch
Epoch 9/20  Iteration 1429/3560 Training loss: 1.9157 0.3991 sec/batch
Epoch 9/20  Iteration 1430/3560 Training loss: 1.9089 0.3889 sec/batch
Epoch 9/20  Iteration 1431/3560 Training loss: 1.9091 0.3908 sec/batch
Epoch 9/20  Iteration 1432/3560 Training loss: 1.9092 0.3857 sec/batch
Epoch 9/20  Iteration 1433/3560 Training loss: 1.9122 0.3860 sec/batch
Epoch 9/20  Iteration 1434/3560 Training loss: 1.9119 0.3813 sec/batch
Epoch 9/20  Iteration 1435/3560 Training loss: 1.9096 0.3868 sec/batch
Epoch 9/20  Iteration 1436/3560 Training loss: 1.9075 0.3961 sec/batch
Epoch 9/20  Iteration 1437/3560 Training loss: 1.9073 0.3817 sec/batch
Epoch 9/20  Iteration 1438/3560 Training loss: 1.9097 0.3846 sec/batch
Epoch 9/20  Iteration 1439/3560 Training loss: 1.9090 0.3826 sec/batch
Epoch 9/20  Iteration 1440/3560 Training loss: 1.9069 0.3822 sec/batch
Epoch 9/20  Iteration 1441/3560 Training loss: 1.9067 0.3947 sec/batch
Epoch 9/20  Iteration 1442/3560 Training loss: 1.9084 0.3902 sec/batch
Epoch 9/20  Iteration 1443/3560 Training loss: 1.9081 0.3951 sec/batch
Epoch 9/20  Iteration 1444/3560 Training loss: 1.9078 0.3848 sec/batch
Epoch 9/20  Iteration 1445/3560 Training loss: 1.9068 0.3812 sec/batch
Epoch 9/20  Iteration 1446/3560 Training loss: 1.9073 0.3859 sec/batch
Epoch 9/20  Iteration 1447/3560 Training loss: 1.9067 0.3804 sec/batch
Epoch 9/20  Iteration 1448/3560 Training loss: 1.9060 0.3808 sec/batch
Epoch 9/20  Iteration 1449/3560 Training loss: 1.9058 0.3903 sec/batch
Epoch 9/20  Iteration 1450/3560 Training loss: 1.9046 0.3877 sec/batch
Epoch 9/20  Iteration 1451/3560 Training loss: 1.9036 0.3972 sec/batch
Epoch 9/20  Iteration 1452/3560 Training loss: 1.9042 0.4054 sec/batch
Epoch 9/20  Iteration 1453/3560 Training loss: 1.9048 0.3865 sec/batch
Epoch 9/20  Iteration 1454/3560 Training loss: 1.9052 0.3922 sec/batch
Epoch 9/20  Iteration 1455/3560 Training loss: 1.9052 0.3918 sec/batch
Epoch 9/20  Iteration 1456/3560 Training loss: 1.9040 0.3854 sec/batch
Epoch 9/20  Iteration 1457/3560 Training loss: 1.9038 0.3847 sec/batch
Epoch 9/20  Iteration 1458/3560 Training loss: 1.9042 0.3968 sec/batch
Epoch 9/20  Iteration 1459/3560 Training loss: 1.9039 0.3842 sec/batch
Epoch 9/20  Iteration 1460/3560 Training loss: 1.9041 0.3831 sec/batch
Epoch 9/20  Iteration 1461/3560 Training loss: 1.9034 0.3832 sec/batch
Epoch 9/20  Iteration 1462/3560 Training loss: 1.9025 0.3834 sec/batch
Epoch 9/20  Iteration 1463/3560 Training loss: 1.9015 0.3844 sec/batch
Epoch 9/20  Iteration 1464/3560 Training loss: 1.9006 0.3913 sec/batch
Epoch 9/20  Iteration 1465/3560 Training loss: 1.9001 0.3832 sec/batch
Epoch 9/20  Iteration 1466/3560 Training loss: 1.9001 0.3854 sec/batch
Epoch 9/20  Iteration 1467/3560 Training loss: 1.8998 0.4050 sec/batch
Epoch 9/20  Iteration 1468/3560 Training loss: 1.8992 0.3926 sec/batch
Epoch 9/20  Iteration 1469/3560 Training loss: 1.8992 0.3860 sec/batch
Epoch 9/20  Iteration 1470/3560 Training loss: 1.8981 0.3792 sec/batch
Epoch 9/20  Iteration 1471/3560 Training loss: 1.8980 0.3847 sec/batch
Epoch 9/20  Iteration 1472/3560 Training loss: 1.8973 0.3790 sec/batch
Epoch 9/20  Iteration 1473/3560 Training loss: 1.8970 0.3845 sec/batch
Epoch 9/20  Iteration 1474/3560 Training loss: 1.8978 0.3892 sec/batch
Epoch 9/20  Iteration 1475/3560 Training loss: 1.8972 0.3966 sec/batch
Epoch 9/20  Iteration 1476/3560 Training loss: 1.8980 0.3879 sec/batch
Epoch 9/20  Iteration 1477/3560 Training loss: 1.8979 0.3843 sec/batch
Epoch 9/20  Iteration 1478/3560 Training loss: 1.8979 0.3856 sec/batch
Epoch 9/20  Iteration 1479/3560 Training loss: 1.8976 0.3876 sec/batch
Epoch 9/20  Iteration 1480/3560 Training loss: 1.8978 0.3846 sec/batch
Epoch 9/20  Iteration 1481/3560 Training loss: 1.8979 0.3994 sec/batch
Epoch 9/20  Iteration 1482/3560 Training loss: 1.8976 0.3957 sec/batch
Epoch 9/20  Iteration 1483/3560 Training loss: 1.8972 0.3793 sec/batch
Epoch 9/20  Iteration 1484/3560 Training loss: 1.8976 0.3872 sec/batch
Epoch 9/20  Iteration 1485/3560 Training loss: 1.8975 0.3819 sec/batch
Epoch 9/20  Iteration 1486/3560 Training loss: 1.8980 0.3821 sec/batch
Epoch 9/20  Iteration 1487/3560 Training loss: 1.8984 0.3852 sec/batch
Epoch 9/20  Iteration 1488/3560 Training loss: 1.8986 0.3818 sec/batch
Epoch 9/20  Iteration 1489/3560 Training loss: 1.8984 0.3863 sec/batch
Epoch 9/20  Iteration 1490/3560 Training loss: 1.8987 0.3828 sec/batch
Epoch 9/20  Iteration 1491/3560 Training loss: 1.8987 0.3938 sec/batch
Epoch 9/20  Iteration 1492/3560 Training loss: 1.8982 0.3868 sec/batch
Epoch 9/20  Iteration 1493/3560 Training loss: 1.8980 0.3929 sec/batch
Epoch 9/20  Iteration 1494/3560 Training loss: 1.8979 0.4017 sec/batch
Epoch 9/20  Iteration 1495/3560 Training loss: 1.8983 0.3931 sec/batch
Epoch 9/20  Iteration 1496/3560 Training loss: 1.8983 0.3839 sec/batch
Epoch 9/20  Iteration 1497/3560 Training loss: 1.8985 0.3826 sec/batch
Epoch 9/20  Iteration 1498/3560 Training loss: 1.8982 0.3845 sec/batch
Epoch 9/20  Iteration 1499/3560 Training loss: 1.8982 0.3842 sec/batch
Epoch 9/20  Iteration 1500/3560 Training loss: 1.8983 0.3865 sec/batch
Epoch 9/20  Iteration 1501/3560 Training loss: 1.8982 0.3823 sec/batch
Epoch 9/20  Iteration 1502/3560 Training loss: 1.8982 0.3809 sec/batch
Epoch 9/20  Iteration 1503/3560 Training loss: 1.8979 0.3845 sec/batch
Epoch 9/20  Iteration 1504/3560 Training loss: 1.8976 0.3830 sec/batch
Epoch 9/20  Iteration 1505/3560 Training loss: 1.8972 0.3877 sec/batch
Epoch 9/20  Iteration 1506/3560 Training loss: 1.8974 0.3877 sec/batch
Epoch 9/20  Iteration 1507/3560 Training loss: 1.8970 0.4028 sec/batch
Epoch 9/20  Iteration 1508/3560 Training loss: 1.8968 0.3882 sec/batch
Epoch 9/20  Iteration 1509/3560 Training loss: 1.8963 0.3886 sec/batch
Epoch 9/20  Iteration 1510/3560 Training loss: 1.8960 0.4198 sec/batch
Epoch 9/20  Iteration 1511/3560 Training loss: 1.8959 0.4062 sec/batch
Epoch 9/20  Iteration 1512/3560 Training loss: 1.8957 0.3873 sec/batch
Epoch 9/20  Iteration 1513/3560 Training loss: 1.8952 0.3949 sec/batch
Epoch 9/20  Iteration 1514/3560 Training loss: 1.8953 0.3879 sec/batch
Epoch 9/20  Iteration 1515/3560 Training loss: 1.8950 0.4043 sec/batch
Epoch 9/20  Iteration 1516/3560 Training loss: 1.8950 0.3866 sec/batch
Epoch 9/20  Iteration 1517/3560 Training loss: 1.8945 0.3911 sec/batch
Epoch 9/20  Iteration 1518/3560 Training loss: 1.8941 0.3940 sec/batch
Epoch 9/20  Iteration 1519/3560 Training loss: 1.8938 0.4127 sec/batch
Epoch 9/20  Iteration 1520/3560 Training loss: 1.8937 0.3836 sec/batch
Epoch 9/20  Iteration 1521/3560 Training loss: 1.8936 0.3843 sec/batch
Epoch 9/20  Iteration 1522/3560 Training loss: 1.8933 0.3864 sec/batch
Epoch 9/20  Iteration 1523/3560 Training loss: 1.8930 0.3845 sec/batch
Epoch 9/20  Iteration 1524/3560 Training loss: 1.8925 0.3978 sec/batch
Epoch 9/20  Iteration 1525/3560 Training loss: 1.8926 0.3949 sec/batch
Epoch 9/20  Iteration 1526/3560 Training loss: 1.8925 0.3927 sec/batch
Epoch 9/20  Iteration 1527/3560 Training loss: 1.8921 0.3867 sec/batch
Epoch 9/20  Iteration 1528/3560 Training loss: 1.8920 0.3855 sec/batch
Epoch 9/20  Iteration 1529/3560 Training loss: 1.8918 0.3869 sec/batch
Epoch 9/20  Iteration 1530/3560 Training loss: 1.8916 0.3837 sec/batch
Epoch 9/20  Iteration 1531/3560 Training loss: 1.8916 0.3785 sec/batch
Epoch 9/20  Iteration 1532/3560 Training loss: 1.8915 0.4021 sec/batch
Epoch 9/20  Iteration 1533/3560 Training loss: 1.8916 0.3847 sec/batch
Epoch 9/20  Iteration 1534/3560 Training loss: 1.8915 0.3878 sec/batch
Epoch 9/20  Iteration 1535/3560 Training loss: 1.8913 0.3834 sec/batch
Epoch 9/20  Iteration 1536/3560 Training loss: 1.8912 0.3795 sec/batch
Epoch 9/20  Iteration 1537/3560 Training loss: 1.8910 0.3823 sec/batch
Epoch 9/20  Iteration 1538/3560 Training loss: 1.8908 0.3900 sec/batch
Epoch 9/20  Iteration 1539/3560 Training loss: 1.8906 0.3802 sec/batch
Epoch 9/20  Iteration 1540/3560 Training loss: 1.8902 0.3873 sec/batch
Epoch 9/20  Iteration 1541/3560 Training loss: 1.8901 0.3891 sec/batch
Epoch 9/20  Iteration 1542/3560 Training loss: 1.8900 0.3804 sec/batch
Epoch 9/20  Iteration 1543/3560 Training loss: 1.8900 0.3803 sec/batch
Epoch 9/20  Iteration 1544/3560 Training loss: 1.8899 0.3859 sec/batch
Epoch 9/20  Iteration 1545/3560 Training loss: 1.8899 0.3927 sec/batch
Epoch 9/20  Iteration 1546/3560 Training loss: 1.8895 0.3936 sec/batch
Epoch 9/20  Iteration 1547/3560 Training loss: 1.8892 0.3876 sec/batch
Epoch 9/20  Iteration 1548/3560 Training loss: 1.8893 0.3856 sec/batch
Epoch 9/20  Iteration 1549/3560 Training loss: 1.8892 0.3902 sec/batch
Epoch 9/20  Iteration 1550/3560 Training loss: 1.8888 0.3793 sec/batch
Epoch 9/20  Iteration 1551/3560 Training loss: 1.8889 0.3926 sec/batch
Epoch 9/20  Iteration 1552/3560 Training loss: 1.8890 0.3897 sec/batch
Epoch 9/20  Iteration 1553/3560 Training loss: 1.8890 0.3933 sec/batch
Epoch 9/20  Iteration 1554/3560 Training loss: 1.8889 0.3863 sec/batch
Epoch 9/20  Iteration 1555/3560 Training loss: 1.8886 0.3854 sec/batch
Epoch 9/20  Iteration 1556/3560 Training loss: 1.8883 0.3821 sec/batch
Epoch 9/20  Iteration 1557/3560 Training loss: 1.8883 0.3789 sec/batch
Epoch 9/20  Iteration 1558/3560 Training loss: 1.8884 0.3840 sec/batch
Epoch 9/20  Iteration 1559/3560 Training loss: 1.8883 0.4232 sec/batch
Epoch 9/20  Iteration 1560/3560 Training loss: 1.8884 0.3826 sec/batch
Epoch 9/20  Iteration 1561/3560 Training loss: 1.8884 0.3935 sec/batch
Epoch 9/20  Iteration 1562/3560 Training loss: 1.8884 0.3792 sec/batch
Epoch 9/20  Iteration 1563/3560 Training loss: 1.8884 0.3851 sec/batch
Epoch 9/20  Iteration 1564/3560 Training loss: 1.8883 0.3864 sec/batch
Epoch 9/20  Iteration 1565/3560 Training loss: 1.8884 0.3791 sec/batch
Epoch 9/20  Iteration 1566/3560 Training loss: 1.8883 0.3807 sec/batch
Epoch 9/20  Iteration 1567/3560 Training loss: 1.8883 0.3895 sec/batch
Epoch 9/20  Iteration 1568/3560 Training loss: 1.8882 0.3854 sec/batch
Epoch 9/20  Iteration 1569/3560 Training loss: 1.8881 0.3906 sec/batch
Epoch 9/20  Iteration 1570/3560 Training loss: 1.8883 0.3871 sec/batch
Epoch 9/20  Iteration 1571/3560 Training loss: 1.8882 0.3920 sec/batch
Epoch 9/20  Iteration 1572/3560 Training loss: 1.8884 0.4022 sec/batch
Epoch 9/20  Iteration 1573/3560 Training loss: 1.8883 0.3842 sec/batch
Epoch 9/20  Iteration 1574/3560 Training loss: 1.8881 0.3868 sec/batch
Epoch 9/20  Iteration 1575/3560 Training loss: 1.8879 0.3834 sec/batch
Epoch 9/20  Iteration 1576/3560 Training loss: 1.8880 0.3931 sec/batch
Epoch 9/20  Iteration 1577/3560 Training loss: 1.8880 0.3799 sec/batch
Epoch 9/20  Iteration 1578/3560 Training loss: 1.8881 0.3856 sec/batch
Epoch 9/20  Iteration 1579/3560 Training loss: 1.8880 0.3840 sec/batch
Epoch 9/20  Iteration 1580/3560 Training loss: 1.8880 0.3893 sec/batch
Epoch 9/20  Iteration 1581/3560 Training loss: 1.8879 0.3842 sec/batch
Epoch 9/20  Iteration 1582/3560 Training loss: 1.8879 0.3815 sec/batch
Epoch 9/20  Iteration 1583/3560 Training loss: 1.8877 0.3925 sec/batch
Epoch 9/20  Iteration 1584/3560 Training loss: 1.8877 0.3861 sec/batch
Epoch 9/20  Iteration 1585/3560 Training loss: 1.8878 0.4036 sec/batch
Epoch 9/20  Iteration 1586/3560 Training loss: 1.8877 0.3884 sec/batch
Epoch 9/20  Iteration 1587/3560 Training loss: 1.8877 0.3873 sec/batch
Epoch 9/20  Iteration 1588/3560 Training loss: 1.8877 0.4897 sec/batch
Epoch 9/20  Iteration 1589/3560 Training loss: 1.8876 0.3841 sec/batch
Epoch 9/20  Iteration 1590/3560 Training loss: 1.8875 0.3772 sec/batch
Epoch 9/20  Iteration 1591/3560 Training loss: 1.8874 0.3874 sec/batch
Epoch 9/20  Iteration 1592/3560 Training loss: 1.8877 0.3755 sec/batch
Epoch 9/20  Iteration 1593/3560 Training loss: 1.8876 0.3739 sec/batch
Epoch 9/20  Iteration 1594/3560 Training loss: 1.8875 0.3847 sec/batch
Epoch 9/20  Iteration 1595/3560 Training loss: 1.8873 0.3729 sec/batch
Epoch 9/20  Iteration 1596/3560 Training loss: 1.8872 0.4068 sec/batch
Epoch 9/20  Iteration 1597/3560 Training loss: 1.8872 0.4169 sec/batch
Epoch 9/20  Iteration 1598/3560 Training loss: 1.8872 0.3801 sec/batch
Epoch 9/20  Iteration 1599/3560 Training loss: 1.8872 0.3833 sec/batch
Epoch 9/20  Iteration 1600/3560 Training loss: 1.8871 0.3761 sec/batch
Epoch 9/20  Iteration 1601/3560 Training loss: 1.8869 0.3765 sec/batch
Epoch 9/20  Iteration 1602/3560 Training loss: 1.8869 0.3733 sec/batch
Epoch 10/20  Iteration 1603/3560 Training loss: 1.9502 0.3762 sec/batch
Epoch 10/20  Iteration 1604/3560 Training loss: 1.9105 0.3806 sec/batch
Epoch 10/20  Iteration 1605/3560 Training loss: 1.8962 0.3813 sec/batch
Epoch 10/20  Iteration 1606/3560 Training loss: 1.8886 0.3793 sec/batch
Epoch 10/20  Iteration 1607/3560 Training loss: 1.8829 0.3817 sec/batch
Epoch 10/20  Iteration 1608/3560 Training loss: 1.8748 0.3753 sec/batch
Epoch 10/20  Iteration 1609/3560 Training loss: 1.8746 0.3774 sec/batch
Epoch 10/20  Iteration 1610/3560 Training loss: 1.8744 0.3857 sec/batch
Epoch 10/20  Iteration 1611/3560 Training loss: 1.8759 0.4035 sec/batch
Epoch 10/20  Iteration 1612/3560 Training loss: 1.8756 0.3719 sec/batch
Epoch 10/20  Iteration 1613/3560 Training loss: 1.8718 0.3798 sec/batch
Epoch 10/20  Iteration 1614/3560 Training loss: 1.8700 0.3782 sec/batch
Epoch 10/20  Iteration 1615/3560 Training loss: 1.8693 0.3754 sec/batch
Epoch 10/20  Iteration 1616/3560 Training loss: 1.8718 0.3742 sec/batch
Epoch 10/20  Iteration 1617/3560 Training loss: 1.8707 0.3727 sec/batch
Epoch 10/20  Iteration 1618/3560 Training loss: 1.8691 0.3728 sec/batch
Epoch 10/20  Iteration 1619/3560 Training loss: 1.8689 0.3816 sec/batch
Epoch 10/20  Iteration 1620/3560 Training loss: 1.8708 0.3807 sec/batch
Epoch 10/20  Iteration 1621/3560 Training loss: 1.8707 0.3766 sec/batch
Epoch 10/20  Iteration 1622/3560 Training loss: 1.8708 0.3814 sec/batch
Epoch 10/20  Iteration 1623/3560 Training loss: 1.8708 0.3822 sec/batch
Epoch 10/20  Iteration 1624/3560 Training loss: 1.8710 0.4092 sec/batch
Epoch 10/20  Iteration 1625/3560 Training loss: 1.8707 0.3773 sec/batch
Epoch 10/20  Iteration 1626/3560 Training loss: 1.8697 0.3849 sec/batch
Epoch 10/20  Iteration 1627/3560 Training loss: 1.8697 0.3742 sec/batch
Epoch 10/20  Iteration 1628/3560 Training loss: 1.8686 0.3912 sec/batch
Epoch 10/20  Iteration 1629/3560 Training loss: 1.8679 0.3759 sec/batch
Epoch 10/20  Iteration 1630/3560 Training loss: 1.8683 0.3807 sec/batch
Epoch 10/20  Iteration 1631/3560 Training loss: 1.8695 0.3788 sec/batch
Epoch 10/20  Iteration 1632/3560 Training loss: 1.8699 0.3760 sec/batch
Epoch 10/20  Iteration 1633/3560 Training loss: 1.8698 0.3751 sec/batch
Epoch 10/20  Iteration 1634/3560 Training loss: 1.8689 0.3809 sec/batch
Epoch 10/20  Iteration 1635/3560 Training loss: 1.8690 0.4270 sec/batch
Epoch 10/20  Iteration 1636/3560 Training loss: 1.8696 0.3793 sec/batch
Epoch 10/20  Iteration 1637/3560 Training loss: 1.8691 0.4290 sec/batch
Epoch 10/20  Iteration 1638/3560 Training loss: 1.8692 0.3803 sec/batch
Epoch 10/20  Iteration 1639/3560 Training loss: 1.8685 0.3816 sec/batch
Epoch 10/20  Iteration 1640/3560 Training loss: 1.8673 0.3780 sec/batch
Epoch 10/20  Iteration 1641/3560 Training loss: 1.8663 0.3804 sec/batch
Epoch 10/20  Iteration 1642/3560 Training loss: 1.8658 0.4093 sec/batch
Epoch 10/20  Iteration 1643/3560 Training loss: 1.8655 0.3745 sec/batch
Epoch 10/20  Iteration 1644/3560 Training loss: 1.8655 0.3833 sec/batch
Epoch 10/20  Iteration 1645/3560 Training loss: 1.8649 0.3796 sec/batch
Epoch 10/20  Iteration 1646/3560 Training loss: 1.8642 0.3829 sec/batch
Epoch 10/20  Iteration 1647/3560 Training loss: 1.8641 0.3776 sec/batch
Epoch 10/20  Iteration 1648/3560 Training loss: 1.8629 0.3802 sec/batch
Epoch 10/20  Iteration 1649/3560 Training loss: 1.8629 1764.6100 sec/batch
Epoch 10/20  Iteration 1650/3560 Training loss: 1.8623 2.7257 sec/batch
Epoch 10/20  Iteration 1651/3560 Training loss: 1.8621 0.5650 sec/batch
Epoch 10/20  Iteration 1652/3560 Training loss: 1.8630 0.6761 sec/batch
Epoch 10/20  Iteration 1653/3560 Training loss: 1.8625 0.4364 sec/batch
Epoch 10/20  Iteration 1654/3560 Training loss: 1.8633 0.4289 sec/batch
Epoch 10/20  Iteration 1655/3560 Training loss: 1.8632 0.4215 sec/batch
Epoch 10/20  Iteration 1656/3560 Training loss: 1.8632 0.4265 sec/batch
Epoch 10/20  Iteration 1657/3560 Training loss: 1.8630 0.4333 sec/batch
Epoch 10/20  Iteration 1658/3560 Training loss: 1.8632 0.4351 sec/batch
Epoch 10/20  Iteration 1659/3560 Training loss: 1.8635 0.4287 sec/batch
Epoch 10/20  Iteration 1660/3560 Training loss: 1.8633 0.5035 sec/batch
Epoch 10/20  Iteration 1661/3560 Training loss: 1.8629 0.4219 sec/batch
Epoch 10/20  Iteration 1662/3560 Training loss: 1.8634 0.3983 sec/batch
Epoch 10/20  Iteration 1663/3560 Training loss: 1.8632 0.3997 sec/batch
Epoch 10/20  Iteration 1664/3560 Training loss: 1.8638 0.4032 sec/batch
Epoch 10/20  Iteration 1665/3560 Training loss: 1.8642 0.4001 sec/batch
Epoch 10/20  Iteration 1666/3560 Training loss: 1.8646 0.4542 sec/batch
Epoch 10/20  Iteration 1667/3560 Training loss: 1.8644 0.5129 sec/batch
Epoch 10/20  Iteration 1668/3560 Training loss: 1.8648 0.4719 sec/batch
Epoch 10/20  Iteration 1669/3560 Training loss: 1.8649 0.5724 sec/batch
Epoch 10/20  Iteration 1670/3560 Training loss: 1.8644 0.5393 sec/batch
Epoch 10/20  Iteration 1671/3560 Training loss: 1.8644 0.5184 sec/batch
Epoch 10/20  Iteration 1672/3560 Training loss: 1.8644 0.5340 sec/batch
Epoch 10/20  Iteration 1673/3560 Training loss: 1.8648 0.4064 sec/batch
Epoch 10/20  Iteration 1674/3560 Training loss: 1.8650 0.4152 sec/batch
Epoch 10/20  Iteration 1675/3560 Training loss: 1.8654 0.4003 sec/batch
Epoch 10/20  Iteration 1676/3560 Training loss: 1.8651 0.3826 sec/batch
Epoch 10/20  Iteration 1677/3560 Training loss: 1.8651 0.4062 sec/batch
Epoch 10/20  Iteration 1678/3560 Training loss: 1.8654 0.3876 sec/batch
Epoch 10/20  Iteration 1679/3560 Training loss: 1.8653 0.3867 sec/batch
Epoch 10/20  Iteration 1680/3560 Training loss: 1.8654 0.3787 sec/batch
Epoch 10/20  Iteration 1681/3560 Training loss: 1.8649 0.5001 sec/batch
Epoch 10/20  Iteration 1682/3560 Training loss: 1.8647 0.4564 sec/batch
Epoch 10/20  Iteration 1683/3560 Training loss: 1.8642 0.4616 sec/batch
Epoch 10/20  Iteration 1684/3560 Training loss: 1.8644 0.4030 sec/batch
Epoch 10/20  Iteration 1685/3560 Training loss: 1.8639 0.4092 sec/batch
Epoch 10/20  Iteration 1686/3560 Training loss: 1.8638 0.3903 sec/batch
Epoch 10/20  Iteration 1687/3560 Training loss: 1.8632 0.3996 sec/batch
Epoch 10/20  Iteration 1688/3560 Training loss: 1.8631 0.4289 sec/batch
Epoch 10/20  Iteration 1689/3560 Training loss: 1.8630 0.4152 sec/batch
Epoch 10/20  Iteration 1690/3560 Training loss: 1.8627 0.4170 sec/batch
Epoch 10/20  Iteration 1691/3560 Training loss: 1.8621 0.4293 sec/batch
Epoch 10/20  Iteration 1692/3560 Training loss: 1.8622 0.3958 sec/batch
Epoch 10/20  Iteration 1693/3560 Training loss: 1.8619 0.3925 sec/batch
Epoch 10/20  Iteration 1694/3560 Training loss: 1.8618 0.3961 sec/batch
Epoch 10/20  Iteration 1695/3560 Training loss: 1.8614 0.3938 sec/batch
Epoch 10/20  Iteration 1696/3560 Training loss: 1.8611 0.4790 sec/batch
Epoch 10/20  Iteration 1697/3560 Training loss: 1.8607 0.4649 sec/batch
Epoch 10/20  Iteration 1698/3560 Training loss: 1.8607 0.4486 sec/batch
Epoch 10/20  Iteration 1699/3560 Training loss: 1.8605 0.4436 sec/batch
Epoch 10/20  Iteration 1700/3560 Training loss: 1.8602 0.4332 sec/batch
Epoch 10/20  Iteration 1701/3560 Training loss: 1.8598 0.4480 sec/batch
Epoch 10/20  Iteration 1702/3560 Training loss: 1.8593 0.4277 sec/batch
Epoch 10/20  Iteration 1703/3560 Training loss: 1.8593 0.4416 sec/batch
Epoch 10/20  Iteration 1704/3560 Training loss: 1.8591 0.4488 sec/batch
Epoch 10/20  Iteration 1705/3560 Training loss: 1.8588 0.4309 sec/batch
Epoch 10/20  Iteration 1706/3560 Training loss: 1.8586 0.4224 sec/batch
Epoch 10/20  Iteration 1707/3560 Training loss: 1.8583 0.3884 sec/batch
Epoch 10/20  Iteration 1708/3560 Training loss: 1.8582 0.3972 sec/batch
Epoch 10/20  Iteration 1709/3560 Training loss: 1.8581 0.4192 sec/batch
Epoch 10/20  Iteration 1710/3560 Training loss: 1.8580 0.3852 sec/batch
Epoch 10/20  Iteration 1711/3560 Training loss: 1.8580 0.3945 sec/batch
Epoch 10/20  Iteration 1712/3560 Training loss: 1.8579 0.4003 sec/batch
Epoch 10/20  Iteration 1713/3560 Training loss: 1.8579 0.4238 sec/batch
Epoch 10/20  Iteration 1714/3560 Training loss: 1.8578 0.4084 sec/batch
Epoch 10/20  Iteration 1715/3560 Training loss: 1.8577 0.4061 sec/batch
Epoch 10/20  Iteration 1716/3560 Training loss: 1.8576 0.3820 sec/batch
Epoch 10/20  Iteration 1717/3560 Training loss: 1.8574 0.3879 sec/batch
Epoch 10/20  Iteration 1718/3560 Training loss: 1.8570 0.3910 sec/batch
Epoch 10/20  Iteration 1719/3560 Training loss: 1.8570 0.5142 sec/batch
Epoch 10/20  Iteration 1720/3560 Training loss: 1.8570 0.5699 sec/batch
Epoch 10/20  Iteration 1721/3560 Training loss: 1.8569 0.5904 sec/batch
Epoch 10/20  Iteration 1722/3560 Training loss: 1.8568 0.4018 sec/batch
Epoch 10/20  Iteration 1723/3560 Training loss: 1.8569 0.4041 sec/batch
Epoch 10/20  Iteration 1724/3560 Training loss: 1.8565 0.3950 sec/batch
Epoch 10/20  Iteration 1725/3560 Training loss: 1.8563 0.4041 sec/batch
Epoch 10/20  Iteration 1726/3560 Training loss: 1.8563 0.3818 sec/batch
Epoch 10/20  Iteration 1727/3560 Training loss: 1.8563 0.4377 sec/batch
Epoch 10/20  Iteration 1728/3560 Training loss: 1.8559 0.4054 sec/batch
Epoch 10/20  Iteration 1729/3560 Training loss: 1.8559 0.3996 sec/batch
Epoch 10/20  Iteration 1730/3560 Training loss: 1.8559 0.4159 sec/batch
Epoch 10/20  Iteration 1731/3560 Training loss: 1.8558 0.4061 sec/batch
Epoch 10/20  Iteration 1732/3560 Training loss: 1.8558 0.4127 sec/batch
Epoch 10/20  Iteration 1733/3560 Training loss: 1.8554 0.3906 sec/batch
Epoch 10/20  Iteration 1734/3560 Training loss: 1.8552 0.4905 sec/batch
Epoch 10/20  Iteration 1735/3560 Training loss: 1.8552 0.3926 sec/batch
Epoch 10/20  Iteration 1736/3560 Training loss: 1.8552 0.3884 sec/batch
Epoch 10/20  Iteration 1737/3560 Training loss: 1.8551 0.4016 sec/batch
Epoch 10/20  Iteration 1738/3560 Training loss: 1.8552 0.4180 sec/batch
Epoch 10/20  Iteration 1739/3560 Training loss: 1.8553 0.3937 sec/batch
Epoch 10/20  Iteration 1740/3560 Training loss: 1.8552 0.4171 sec/batch
Epoch 10/20  Iteration 1741/3560 Training loss: 1.8554 0.3848 sec/batch
Epoch 10/20  Iteration 1742/3560 Training loss: 1.8552 0.3839 sec/batch
Epoch 10/20  Iteration 1743/3560 Training loss: 1.8554 0.3852 sec/batch
Epoch 10/20  Iteration 1744/3560 Training loss: 1.8553 0.4072 sec/batch
Epoch 10/20  Iteration 1745/3560 Training loss: 1.8553 0.4242 sec/batch
Epoch 10/20  Iteration 1746/3560 Training loss: 1.8553 0.5562 sec/batch
Epoch 10/20  Iteration 1747/3560 Training loss: 1.8552 0.5657 sec/batch
Epoch 10/20  Iteration 1748/3560 Training loss: 1.8553 0.6351 sec/batch
Epoch 10/20  Iteration 1749/3560 Training loss: 1.8553 0.5124 sec/batch
Epoch 10/20  Iteration 1750/3560 Training loss: 1.8555 0.4455 sec/batch
Epoch 10/20  Iteration 1751/3560 Training loss: 1.8555 0.4045 sec/batch
Epoch 10/20  Iteration 1752/3560 Training loss: 1.8553 0.4150 sec/batch
Epoch 10/20  Iteration 1753/3560 Training loss: 1.8551 0.4492 sec/batch
Epoch 10/20  Iteration 1754/3560 Training loss: 1.8552 0.4092 sec/batch
Epoch 10/20  Iteration 1755/3560 Training loss: 1.8552 0.4161 sec/batch
Epoch 10/20  Iteration 1756/3560 Training loss: 1.8552 0.3924 sec/batch
Epoch 10/20  Iteration 1757/3560 Training loss: 1.8551 0.3979 sec/batch
Epoch 10/20  Iteration 1758/3560 Training loss: 1.8552 0.3892 sec/batch
Epoch 10/20  Iteration 1759/3560 Training loss: 1.8551 0.3827 sec/batch
Epoch 10/20  Iteration 1760/3560 Training loss: 1.8551 0.4955 sec/batch
Epoch 10/20  Iteration 1761/3560 Training loss: 1.8549 0.5304 sec/batch
Epoch 10/20  Iteration 1762/3560 Training loss: 1.8550 0.3803 sec/batch
Epoch 10/20  Iteration 1763/3560 Training loss: 1.8550 0.3789 sec/batch
Epoch 10/20  Iteration 1764/3560 Training loss: 1.8549 0.4291 sec/batch
Epoch 10/20  Iteration 1765/3560 Training loss: 1.8550 0.4621 sec/batch
Epoch 10/20  Iteration 1766/3560 Training loss: 1.8549 0.4551 sec/batch
Epoch 10/20  Iteration 1767/3560 Training loss: 1.8549 0.4394 sec/batch
Epoch 10/20  Iteration 1768/3560 Training loss: 1.8549 0.3785 sec/batch
Epoch 10/20  Iteration 1769/3560 Training loss: 1.8549 0.3781 sec/batch
Epoch 10/20  Iteration 1770/3560 Training loss: 1.8552 0.5620 sec/batch
Epoch 10/20  Iteration 1771/3560 Training loss: 1.8552 0.3985 sec/batch
Epoch 10/20  Iteration 1772/3560 Training loss: 1.8551 0.4563 sec/batch
Epoch 10/20  Iteration 1773/3560 Training loss: 1.8550 0.3911 sec/batch
Epoch 10/20  Iteration 1774/3560 Training loss: 1.8549 0.3811 sec/batch
Epoch 10/20  Iteration 1775/3560 Training loss: 1.8549 0.4004 sec/batch
Epoch 10/20  Iteration 1776/3560 Training loss: 1.8550 0.3916 sec/batch
Epoch 10/20  Iteration 1777/3560 Training loss: 1.8549 0.4013 sec/batch
Epoch 10/20  Iteration 1778/3560 Training loss: 1.8548 0.3877 sec/batch
Epoch 10/20  Iteration 1779/3560 Training loss: 1.8547 0.3759 sec/batch
Epoch 10/20  Iteration 1780/3560 Training loss: 1.8547 0.3853 sec/batch
Epoch 11/20  Iteration 1781/3560 Training loss: 1.9136 0.3781 sec/batch
Epoch 11/20  Iteration 1782/3560 Training loss: 1.8755 0.3937 sec/batch
Epoch 11/20  Iteration 1783/3560 Training loss: 1.8663 0.4001 sec/batch
Epoch 11/20  Iteration 1784/3560 Training loss: 1.8605 0.4030 sec/batch
Epoch 11/20  Iteration 1785/3560 Training loss: 1.8555 0.4246 sec/batch
Epoch 11/20  Iteration 1786/3560 Training loss: 1.8466 0.3845 sec/batch
Epoch 11/20  Iteration 1787/3560 Training loss: 1.8467 0.4603 sec/batch
Epoch 11/20  Iteration 1788/3560 Training loss: 1.8443 0.5406 sec/batch
Epoch 11/20  Iteration 1789/3560 Training loss: 1.8471 0.5413 sec/batch
Epoch 11/20  Iteration 1790/3560 Training loss: 1.8468 0.4984 sec/batch
Epoch 11/20  Iteration 1791/3560 Training loss: 1.8436 0.4005 sec/batch
Epoch 11/20  Iteration 1792/3560 Training loss: 1.8418 0.4173 sec/batch
Epoch 11/20  Iteration 1793/3560 Training loss: 1.8413 0.4093 sec/batch
Epoch 11/20  Iteration 1794/3560 Training loss: 1.8437 0.3858 sec/batch
Epoch 11/20  Iteration 1795/3560 Training loss: 1.8425 0.3753 sec/batch
Epoch 11/20  Iteration 1796/3560 Training loss: 1.8407 0.4049 sec/batch
Epoch 11/20  Iteration 1797/3560 Training loss: 1.8398 0.3817 sec/batch
Epoch 11/20  Iteration 1798/3560 Training loss: 1.8413 0.3935 sec/batch
Epoch 11/20  Iteration 1799/3560 Training loss: 1.8409 0.4901 sec/batch
Epoch 11/20  Iteration 1800/3560 Training loss: 1.8413 0.4983 sec/batch
Epoch 11/20  Iteration 1801/3560 Training loss: 1.8409 0.4822 sec/batch
Epoch 11/20  Iteration 1802/3560 Training loss: 1.8412 0.4694 sec/batch
Epoch 11/20  Iteration 1803/3560 Training loss: 1.8409 0.3929 sec/batch
Epoch 11/20  Iteration 1804/3560 Training loss: 1.8404 0.4502 sec/batch
Epoch 11/20  Iteration 1805/3560 Training loss: 1.8404 0.3793 sec/batch
Epoch 11/20  Iteration 1806/3560 Training loss: 1.8394 0.3760 sec/batch
Epoch 11/20  Iteration 1807/3560 Training loss: 1.8388 0.3797 sec/batch
Epoch 11/20  Iteration 1808/3560 Training loss: 1.8391 0.4040 sec/batch
Epoch 11/20  Iteration 1809/3560 Training loss: 1.8401 0.3742 sec/batch
Epoch 11/20  Iteration 1810/3560 Training loss: 1.8406 0.3825 sec/batch
Epoch 11/20  Iteration 1811/3560 Training loss: 1.8407 0.3727 sec/batch
Epoch 11/20  Iteration 1812/3560 Training loss: 1.8398 0.3859 sec/batch
Epoch 11/20  Iteration 1813/3560 Training loss: 1.8400 0.3831 sec/batch
Epoch 11/20  Iteration 1814/3560 Training loss: 1.8408 0.3788 sec/batch
Epoch 11/20  Iteration 1815/3560 Training loss: 1.8404 0.3781 sec/batch
Epoch 11/20  Iteration 1816/3560 Training loss: 1.8405 0.3826 sec/batch
Epoch 11/20  Iteration 1817/3560 Training loss: 1.8400 0.3800 sec/batch
Epoch 11/20  Iteration 1818/3560 Training loss: 1.8390 0.3842 sec/batch
Epoch 11/20  Iteration 1819/3560 Training loss: 1.8380 0.3770 sec/batch
Epoch 11/20  Iteration 1820/3560 Training loss: 1.8373 0.3778 sec/batch
Epoch 11/20  Iteration 1821/3560 Training loss: 1.8371 0.3986 sec/batch
Epoch 11/20  Iteration 1822/3560 Training loss: 1.8373 0.3819 sec/batch
Epoch 11/20  Iteration 1823/3560 Training loss: 1.8369 0.3833 sec/batch
Epoch 11/20  Iteration 1824/3560 Training loss: 1.8363 0.3782 sec/batch
Epoch 11/20  Iteration 1825/3560 Training loss: 1.8366 0.3785 sec/batch
Epoch 11/20  Iteration 1826/3560 Training loss: 1.8355 0.3798 sec/batch
Epoch 11/20  Iteration 1827/3560 Training loss: 1.8353 0.3782 sec/batch
Epoch 11/20  Iteration 1828/3560 Training loss: 1.8348 0.3825 sec/batch
Epoch 11/20  Iteration 1829/3560 Training loss: 1.8347 0.4091 sec/batch
Epoch 11/20  Iteration 1830/3560 Training loss: 1.8354 0.3741 sec/batch
Epoch 11/20  Iteration 1831/3560 Training loss: 1.8348 0.3933 sec/batch
Epoch 11/20  Iteration 1832/3560 Training loss: 1.8357 0.3903 sec/batch
Epoch 11/20  Iteration 1833/3560 Training loss: 1.8355 0.3792 sec/batch
Epoch 11/20  Iteration 1834/3560 Training loss: 1.8355 0.4111 sec/batch
Epoch 11/20  Iteration 1835/3560 Training loss: 1.8355 0.3772 sec/batch
Epoch 11/20  Iteration 1836/3560 Training loss: 1.8358 0.3794 sec/batch
Epoch 11/20  Iteration 1837/3560 Training loss: 1.8362 0.3752 sec/batch
Epoch 11/20  Iteration 1838/3560 Training loss: 1.8359 0.3735 sec/batch
Epoch 11/20  Iteration 1839/3560 Training loss: 1.8355 0.3776 sec/batch
Epoch 11/20  Iteration 1840/3560 Training loss: 1.8361 0.3938 sec/batch
Epoch 11/20  Iteration 1841/3560 Training loss: 1.8359 0.3819 sec/batch
Epoch 11/20  Iteration 1842/3560 Training loss: 1.8365 0.3884 sec/batch
Epoch 11/20  Iteration 1843/3560 Training loss: 1.8368 0.3851 sec/batch
Epoch 11/20  Iteration 1844/3560 Training loss: 1.8371 0.3854 sec/batch
Epoch 11/20  Iteration 1845/3560 Training loss: 1.8370 0.3791 sec/batch
Epoch 11/20  Iteration 1846/3560 Training loss: 1.8373 0.3821 sec/batch
Epoch 11/20  Iteration 1847/3560 Training loss: 1.8374 0.4038 sec/batch
Epoch 11/20  Iteration 1848/3560 Training loss: 1.8370 0.3978 sec/batch
Epoch 11/20  Iteration 1849/3560 Training loss: 1.8369 0.3918 sec/batch
Epoch 11/20  Iteration 1850/3560 Training loss: 1.8368 0.4113 sec/batch
Epoch 11/20  Iteration 1851/3560 Training loss: 1.8372 0.3940 sec/batch
Epoch 11/20  Iteration 1852/3560 Training loss: 1.8373 0.3938 sec/batch
Epoch 11/20  Iteration 1853/3560 Training loss: 1.8376 0.3919 sec/batch
Epoch 11/20  Iteration 1854/3560 Training loss: 1.8374 0.3764 sec/batch
Epoch 11/20  Iteration 1855/3560 Training loss: 1.8373 0.3771 sec/batch
Epoch 11/20  Iteration 1856/3560 Training loss: 1.8376 0.3841 sec/batch
Epoch 11/20  Iteration 1857/3560 Training loss: 1.8376 0.3763 sec/batch
Epoch 11/20  Iteration 1858/3560 Training loss: 1.8378 0.3750 sec/batch
Epoch 11/20  Iteration 1859/3560 Training loss: 1.8373 0.3786 sec/batch
Epoch 11/20  Iteration 1860/3560 Training loss: 1.8372 0.3980 sec/batch
Epoch 11/20  Iteration 1861/3560 Training loss: 1.8366 0.3813 sec/batch
Epoch 11/20  Iteration 1862/3560 Training loss: 1.8369 0.3805 sec/batch
Epoch 11/20  Iteration 1863/3560 Training loss: 1.8364 0.3885 sec/batch
Epoch 11/20  Iteration 1864/3560 Training loss: 1.8363 0.3799 sec/batch
Epoch 11/20  Iteration 1865/3560 Training loss: 1.8358 0.3737 sec/batch
Epoch 11/20  Iteration 1866/3560 Training loss: 1.8355 0.3899 sec/batch
Epoch 11/20  Iteration 1867/3560 Training loss: 1.8354 0.3770 sec/batch
Epoch 11/20  Iteration 1868/3560 Training loss: 1.8352 0.3802 sec/batch
Epoch 11/20  Iteration 1869/3560 Training loss: 1.8347 0.3768 sec/batch
Epoch 11/20  Iteration 1870/3560 Training loss: 1.8348 0.3732 sec/batch
Epoch 11/20  Iteration 1871/3560 Training loss: 1.8345 0.3808 sec/batch
Epoch 11/20  Iteration 1872/3560 Training loss: 1.8344 0.3774 sec/batch
Epoch 11/20  Iteration 1873/3560 Training loss: 1.8340 0.4008 sec/batch
Epoch 11/20  Iteration 1874/3560 Training loss: 1.8335 0.4252 sec/batch
Epoch 11/20  Iteration 1875/3560 Training loss: 1.8331 0.3761 sec/batch
Epoch 11/20  Iteration 1876/3560 Training loss: 1.8331 0.3764 sec/batch
Epoch 11/20  Iteration 1877/3560 Training loss: 1.8331 0.4061 sec/batch
Epoch 11/20  Iteration 1878/3560 Training loss: 1.8328 0.3864 sec/batch
Epoch 11/20  Iteration 1879/3560 Training loss: 1.8324 0.3886 sec/batch
Epoch 11/20  Iteration 1880/3560 Training loss: 1.8319 0.4027 sec/batch
Epoch 11/20  Iteration 1881/3560 Training loss: 1.8318 0.4089 sec/batch
Epoch 11/20  Iteration 1882/3560 Training loss: 1.8317 0.3993 sec/batch
Epoch 11/20  Iteration 1883/3560 Training loss: 1.8315 0.3888 sec/batch
Epoch 11/20  Iteration 1884/3560 Training loss: 1.8313 0.3840 sec/batch
Epoch 11/20  Iteration 1885/3560 Training loss: 1.8311 0.3911 sec/batch
Epoch 11/20  Iteration 1886/3560 Training loss: 1.8309 0.3972 sec/batch
Epoch 11/20  Iteration 1887/3560 Training loss: 1.8309 0.3864 sec/batch
Epoch 11/20  Iteration 1888/3560 Training loss: 1.8308 0.3885 sec/batch
Epoch 11/20  Iteration 1889/3560 Training loss: 1.8309 0.3773 sec/batch
Epoch 11/20  Iteration 1890/3560 Training loss: 1.8309 0.3795 sec/batch
Epoch 11/20  Iteration 1891/3560 Training loss: 1.8309 0.3923 sec/batch
Epoch 11/20  Iteration 1892/3560 Training loss: 1.8308 0.4137 sec/batch
Epoch 11/20  Iteration 1893/3560 Training loss: 1.8306 0.3820 sec/batch
Epoch 11/20  Iteration 1894/3560 Training loss: 1.8305 0.3840 sec/batch
Epoch 11/20  Iteration 1895/3560 Training loss: 1.8303 0.3794 sec/batch
Epoch 11/20  Iteration 1896/3560 Training loss: 1.8298 0.3789 sec/batch
Epoch 11/20  Iteration 1897/3560 Training loss: 1.8298 0.3740 sec/batch
Epoch 11/20  Iteration 1898/3560 Training loss: 1.8297 0.3764 sec/batch
Epoch 11/20  Iteration 1899/3560 Training loss: 1.8297 0.3817 sec/batch
Epoch 11/20  Iteration 1900/3560 Training loss: 1.8297 0.3959 sec/batch
Epoch 11/20  Iteration 1901/3560 Training loss: 1.8297 0.3820 sec/batch
Epoch 11/20  Iteration 1902/3560 Training loss: 1.8294 0.3807 sec/batch
Epoch 11/20  Iteration 1903/3560 Training loss: 1.8291 0.3748 sec/batch
Epoch 11/20  Iteration 1904/3560 Training loss: 1.8292 0.3753 sec/batch
Epoch 11/20  Iteration 1905/3560 Training loss: 1.8291 0.3871 sec/batch
Epoch 11/20  Iteration 1906/3560 Training loss: 1.8287 0.3903 sec/batch
Epoch 11/20  Iteration 1907/3560 Training loss: 1.8288 0.3773 sec/batch
Epoch 11/20  Iteration 1908/3560 Training loss: 1.8288 0.3766 sec/batch
Epoch 11/20  Iteration 1909/3560 Training loss: 1.8287 0.3757 sec/batch
Epoch 11/20  Iteration 1910/3560 Training loss: 1.8286 0.3894 sec/batch
Epoch 11/20  Iteration 1911/3560 Training loss: 1.8283 0.3735 sec/batch
Epoch 11/20  Iteration 1912/3560 Training loss: 1.8281 0.3769 sec/batch
Epoch 11/20  Iteration 1913/3560 Training loss: 1.8281 0.4017 sec/batch
Epoch 11/20  Iteration 1914/3560 Training loss: 1.8281 0.3768 sec/batch
Epoch 11/20  Iteration 1915/3560 Training loss: 1.8281 0.3731 sec/batch
Epoch 11/20  Iteration 1916/3560 Training loss: 1.8282 0.3751 sec/batch
Epoch 11/20  Iteration 1917/3560 Training loss: 1.8282 0.3766 sec/batch
Epoch 11/20  Iteration 1918/3560 Training loss: 1.8282 0.3743 sec/batch
Epoch 11/20  Iteration 1919/3560 Training loss: 1.8284 0.3753 sec/batch
Epoch 11/20  Iteration 1920/3560 Training loss: 1.8282 0.3737 sec/batch
Epoch 11/20  Iteration 1921/3560 Training loss: 1.8284 0.3741 sec/batch
Epoch 11/20  Iteration 1922/3560 Training loss: 1.8284 0.3855 sec/batch
Epoch 11/20  Iteration 1923/3560 Training loss: 1.8283 0.3838 sec/batch
Epoch 11/20  Iteration 1924/3560 Training loss: 1.8283 0.3868 sec/batch
Epoch 11/20  Iteration 1925/3560 Training loss: 1.8283 0.3819 sec/batch
Epoch 11/20  Iteration 1926/3560 Training loss: 1.8284 0.4126 sec/batch
Epoch 11/20  Iteration 1927/3560 Training loss: 1.8284 0.3877 sec/batch
Epoch 11/20  Iteration 1928/3560 Training loss: 1.8285 0.3749 sec/batch
Epoch 11/20  Iteration 1929/3560 Training loss: 1.8285 0.3771 sec/batch
Epoch 11/20  Iteration 1930/3560 Training loss: 1.8283 0.3745 sec/batch
Epoch 11/20  Iteration 1931/3560 Training loss: 1.8281 0.3764 sec/batch
Epoch 11/20  Iteration 1932/3560 Training loss: 1.8282 0.3994 sec/batch
Epoch 11/20  Iteration 1933/3560 Training loss: 1.8282 0.3795 sec/batch
Epoch 11/20  Iteration 1934/3560 Training loss: 1.8282 0.3987 sec/batch
Epoch 11/20  Iteration 1935/3560 Training loss: 1.8281 0.3749 sec/batch
Epoch 11/20  Iteration 1936/3560 Training loss: 1.8281 0.3976 sec/batch
Epoch 11/20  Iteration 1937/3560 Training loss: 1.8281 0.3829 sec/batch
Epoch 11/20  Iteration 1938/3560 Training loss: 1.8281 0.4045 sec/batch
Epoch 11/20  Iteration 1939/3560 Training loss: 1.8278 0.4146 sec/batch
Epoch 11/20  Iteration 1940/3560 Training loss: 1.8279 0.3735 sec/batch
Epoch 11/20  Iteration 1941/3560 Training loss: 1.8280 0.3795 sec/batch
Epoch 11/20  Iteration 1942/3560 Training loss: 1.8280 0.4010 sec/batch
Epoch 11/20  Iteration 1943/3560 Training loss: 1.8280 0.3871 sec/batch
Epoch 11/20  Iteration 1944/3560 Training loss: 1.8279 0.4144 sec/batch
Epoch 11/20  Iteration 1945/3560 Training loss: 1.8279 0.3817 sec/batch
Epoch 11/20  Iteration 1946/3560 Training loss: 1.8278 0.3760 sec/batch
Epoch 11/20  Iteration 1947/3560 Training loss: 1.8279 0.4151 sec/batch
Epoch 11/20  Iteration 1948/3560 Training loss: 1.8282 0.4035 sec/batch
Epoch 11/20  Iteration 1949/3560 Training loss: 1.8281 0.3951 sec/batch
Epoch 11/20  Iteration 1950/3560 Training loss: 1.8280 0.3787 sec/batch
Epoch 11/20  Iteration 1951/3560 Training loss: 1.8279 0.3778 sec/batch
Epoch 11/20  Iteration 1952/3560 Training loss: 1.8278 0.4130 sec/batch
Epoch 11/20  Iteration 1953/3560 Training loss: 1.8279 0.3753 sec/batch
Epoch 11/20  Iteration 1954/3560 Training loss: 1.8278 0.4082 sec/batch
Epoch 11/20  Iteration 1955/3560 Training loss: 1.8278 0.3872 sec/batch
Epoch 11/20  Iteration 1956/3560 Training loss: 1.8278 0.3807 sec/batch
Epoch 11/20  Iteration 1957/3560 Training loss: 1.8276 0.4001 sec/batch
Epoch 11/20  Iteration 1958/3560 Training loss: 1.8276 0.3855 sec/batch
Epoch 12/20  Iteration 1959/3560 Training loss: 1.8836 0.3829 sec/batch
Epoch 12/20  Iteration 1960/3560 Training loss: 1.8465 0.3963 sec/batch
Epoch 12/20  Iteration 1961/3560 Training loss: 1.8366 0.3753 sec/batch
Epoch 12/20  Iteration 1962/3560 Training loss: 1.8337 0.3809 sec/batch
Epoch 12/20  Iteration 1963/3560 Training loss: 1.8278 0.3793 sec/batch
Epoch 12/20  Iteration 1964/3560 Training loss: 1.8207 0.3744 sec/batch
Epoch 12/20  Iteration 1965/3560 Training loss: 1.8208 0.4097 sec/batch
Epoch 12/20  Iteration 1966/3560 Training loss: 1.8201 0.3880 sec/batch
Epoch 12/20  Iteration 1967/3560 Training loss: 1.8222 0.3736 sec/batch
Epoch 12/20  Iteration 1968/3560 Training loss: 1.8216 0.3787 sec/batch
Epoch 12/20  Iteration 1969/3560 Training loss: 1.8184 0.3801 sec/batch
Epoch 12/20  Iteration 1970/3560 Training loss: 1.8163 0.4032 sec/batch
Epoch 12/20  Iteration 1971/3560 Training loss: 1.8165 0.3876 sec/batch
Epoch 12/20  Iteration 1972/3560 Training loss: 1.8185 0.3950 sec/batch
Epoch 12/20  Iteration 1973/3560 Training loss: 1.8176 0.3787 sec/batch
Epoch 12/20  Iteration 1974/3560 Training loss: 1.8151 0.3831 sec/batch
Epoch 12/20  Iteration 1975/3560 Training loss: 1.8146 0.3744 sec/batch
Epoch 12/20  Iteration 1976/3560 Training loss: 1.8162 0.3758 sec/batch
Epoch 12/20  Iteration 1977/3560 Training loss: 1.8162 0.3746 sec/batch
Epoch 12/20  Iteration 1978/3560 Training loss: 1.8171 0.4016 sec/batch
Epoch 12/20  Iteration 1979/3560 Training loss: 1.8164 0.3972 sec/batch
Epoch 12/20  Iteration 1980/3560 Training loss: 1.8168 0.3865 sec/batch
Epoch 12/20  Iteration 1981/3560 Training loss: 1.8168 0.3748 sec/batch
Epoch 12/20  Iteration 1982/3560 Training loss: 1.8163 0.3826 sec/batch
Epoch 12/20  Iteration 1983/3560 Training loss: 1.8161 0.4086 sec/batch
Epoch 12/20  Iteration 1984/3560 Training loss: 1.8154 0.3789 sec/batch
Epoch 12/20  Iteration 1985/3560 Training loss: 1.8144 0.3735 sec/batch
Epoch 12/20  Iteration 1986/3560 Training loss: 1.8147 0.3742 sec/batch
Epoch 12/20  Iteration 1987/3560 Training loss: 1.8156 0.3785 sec/batch
Epoch 12/20  Iteration 1988/3560 Training loss: 1.8160 0.3846 sec/batch
Epoch 12/20  Iteration 1989/3560 Training loss: 1.8157 0.3790 sec/batch
Epoch 12/20  Iteration 1990/3560 Training loss: 1.8148 0.3763 sec/batch
Epoch 12/20  Iteration 1991/3560 Training loss: 1.8152 0.3982 sec/batch
Epoch 12/20  Iteration 1992/3560 Training loss: 1.8156 0.3828 sec/batch
Epoch 12/20  Iteration 1993/3560 Training loss: 1.8152 0.3773 sec/batch
Epoch 12/20  Iteration 1994/3560 Training loss: 1.8153 0.4024 sec/batch
Epoch 12/20  Iteration 1995/3560 Training loss: 1.8149 0.3818 sec/batch
Epoch 12/20  Iteration 1996/3560 Training loss: 1.8138 0.3852 sec/batch
Epoch 12/20  Iteration 1997/3560 Training loss: 1.8128 0.3789 sec/batch
Epoch 12/20  Iteration 1998/3560 Training loss: 1.8120 0.3914 sec/batch
Epoch 12/20  Iteration 1999/3560 Training loss: 1.8117 0.3798 sec/batch
Epoch 12/20  Iteration 2000/3560 Training loss: 1.8119 0.3918 sec/batch
Epoch 12/20  Iteration 2001/3560 Training loss: 1.8113 0.3749 sec/batch
Epoch 12/20  Iteration 2002/3560 Training loss: 1.8104 0.3795 sec/batch
Epoch 12/20  Iteration 2003/3560 Training loss: 1.8104 0.3861 sec/batch
Epoch 12/20  Iteration 2004/3560 Training loss: 1.8093 0.4158 sec/batch
Epoch 12/20  Iteration 2005/3560 Training loss: 1.8091 0.3788 sec/batch
Epoch 12/20  Iteration 2006/3560 Training loss: 1.8084 0.3849 sec/batch
Epoch 12/20  Iteration 2007/3560 Training loss: 1.8083 0.3839 sec/batch
Epoch 12/20  Iteration 2008/3560 Training loss: 1.8090 0.3804 sec/batch
Epoch 12/20  Iteration 2009/3560 Training loss: 1.8084 0.3766 sec/batch
Epoch 12/20  Iteration 2010/3560 Training loss: 1.8094 0.3832 sec/batch
Epoch 12/20  Iteration 2011/3560 Training loss: 1.8093 0.3763 sec/batch
Epoch 12/20  Iteration 2012/3560 Training loss: 1.8095 0.3730 sec/batch
Epoch 12/20  Iteration 2013/3560 Training loss: 1.8092 0.3785 sec/batch
Epoch 12/20  Iteration 2014/3560 Training loss: 1.8093 0.3739 sec/batch
Epoch 12/20  Iteration 2015/3560 Training loss: 1.8095 0.3741 sec/batch
Epoch 12/20  Iteration 2016/3560 Training loss: 1.8091 0.3794 sec/batch
Epoch 12/20  Iteration 2017/3560 Training loss: 1.8088 0.3810 sec/batch
Epoch 12/20  Iteration 2018/3560 Training loss: 1.8094 0.4068 sec/batch
Epoch 12/20  Iteration 2019/3560 Training loss: 1.8092 0.3816 sec/batch
Epoch 12/20  Iteration 2020/3560 Training loss: 1.8100 0.3743 sec/batch
Epoch 12/20  Iteration 2021/3560 Training loss: 1.8103 0.3822 sec/batch
Epoch 12/20  Iteration 2022/3560 Training loss: 1.8106 0.3763 sec/batch
Epoch 12/20  Iteration 2023/3560 Training loss: 1.8105 0.3720 sec/batch
Epoch 12/20  Iteration 2024/3560 Training loss: 1.8107 0.3762 sec/batch
Epoch 12/20  Iteration 2025/3560 Training loss: 1.8109 0.3940 sec/batch
Epoch 12/20  Iteration 2026/3560 Training loss: 1.8104 0.3775 sec/batch
Epoch 12/20  Iteration 2027/3560 Training loss: 1.8104 0.3814 sec/batch
Epoch 12/20  Iteration 2028/3560 Training loss: 1.8104 0.3822 sec/batch
Epoch 12/20  Iteration 2029/3560 Training loss: 1.8108 0.3753 sec/batch
Epoch 12/20  Iteration 2030/3560 Training loss: 1.8111 0.3806 sec/batch
Epoch 12/20  Iteration 2031/3560 Training loss: 1.8114 0.3978 sec/batch
Epoch 12/20  Iteration 2032/3560 Training loss: 1.8112 0.3806 sec/batch
Epoch 12/20  Iteration 2033/3560 Training loss: 1.8113 0.3748 sec/batch
Epoch 12/20  Iteration 2034/3560 Training loss: 1.8115 0.3769 sec/batch
Epoch 12/20  Iteration 2035/3560 Training loss: 1.8115 0.3763 sec/batch
Epoch 12/20  Iteration 2036/3560 Training loss: 1.8117 0.3778 sec/batch
Epoch 12/20  Iteration 2037/3560 Training loss: 1.8112 0.3720 sec/batch
Epoch 12/20  Iteration 2038/3560 Training loss: 1.8111 0.3721 sec/batch
Epoch 12/20  Iteration 2039/3560 Training loss: 1.8106 0.4003 sec/batch
Epoch 12/20  Iteration 2040/3560 Training loss: 1.8108 0.3860 sec/batch
Epoch 12/20  Iteration 2041/3560 Training loss: 1.8105 0.3826 sec/batch
Epoch 12/20  Iteration 2042/3560 Training loss: 1.8104 0.3858 sec/batch
Epoch 12/20  Iteration 2043/3560 Training loss: 1.8099 0.3896 sec/batch
Epoch 12/20  Iteration 2044/3560 Training loss: 1.8096 0.4123 sec/batch
Epoch 12/20  Iteration 2045/3560 Training loss: 1.8095 0.3816 sec/batch
Epoch 12/20  Iteration 2046/3560 Training loss: 1.8091 0.3833 sec/batch
Epoch 12/20  Iteration 2047/3560 Training loss: 1.8086 0.4585 sec/batch
Epoch 12/20  Iteration 2048/3560 Training loss: 1.8087 0.5250 sec/batch
Epoch 12/20  Iteration 2049/3560 Training loss: 1.8084 0.3906 sec/batch
Epoch 12/20  Iteration 2050/3560 Training loss: 1.8084 0.3796 sec/batch
Epoch 12/20  Iteration 2051/3560 Training loss: 1.8079 0.3767 sec/batch
Epoch 12/20  Iteration 2052/3560 Training loss: 1.8074 0.3780 sec/batch
Epoch 12/20  Iteration 2053/3560 Training loss: 1.8070 0.3917 sec/batch
Epoch 12/20  Iteration 2054/3560 Training loss: 1.8070 0.3764 sec/batch
Epoch 12/20  Iteration 2055/3560 Training loss: 1.8069 0.3818 sec/batch
Epoch 12/20  Iteration 2056/3560 Training loss: 1.8066 0.3969 sec/batch
Epoch 12/20  Iteration 2057/3560 Training loss: 1.8063 0.3882 sec/batch
Epoch 12/20  Iteration 2058/3560 Training loss: 1.8060 0.3775 sec/batch
Epoch 12/20  Iteration 2059/3560 Training loss: 1.8060 0.3893 sec/batch
Epoch 12/20  Iteration 2060/3560 Training loss: 1.8057 0.3820 sec/batch
Epoch 12/20  Iteration 2061/3560 Training loss: 1.8056 0.3854 sec/batch
Epoch 12/20  Iteration 2062/3560 Training loss: 1.8054 0.3725 sec/batch
Epoch 12/20  Iteration 2063/3560 Training loss: 1.8052 0.3818 sec/batch
Epoch 12/20  Iteration 2064/3560 Training loss: 1.8051 0.3847 sec/batch
Epoch 12/20  Iteration 2065/3560 Training loss: 1.8051 0.3768 sec/batch
Epoch 12/20  Iteration 2066/3560 Training loss: 1.8050 0.3853 sec/batch
Epoch 12/20  Iteration 2067/3560 Training loss: 1.8051 0.3755 sec/batch
Epoch 12/20  Iteration 2068/3560 Training loss: 1.8051 0.3783 sec/batch
Epoch 12/20  Iteration 2069/3560 Training loss: 1.8051 0.3740 sec/batch
Epoch 12/20  Iteration 2070/3560 Training loss: 1.8050 0.4010 sec/batch
Epoch 12/20  Iteration 2071/3560 Training loss: 1.8049 0.3754 sec/batch
Epoch 12/20  Iteration 2072/3560 Training loss: 1.8048 0.3835 sec/batch
Epoch 12/20  Iteration 2073/3560 Training loss: 1.8046 0.3771 sec/batch
Epoch 12/20  Iteration 2074/3560 Training loss: 1.8043 0.3776 sec/batch
Epoch 12/20  Iteration 2075/3560 Training loss: 1.8042 0.3780 sec/batch
Epoch 12/20  Iteration 2076/3560 Training loss: 1.8041 0.3804 sec/batch
Epoch 12/20  Iteration 2077/3560 Training loss: 1.8040 0.3760 sec/batch
Epoch 12/20  Iteration 2078/3560 Training loss: 1.8040 0.3756 sec/batch
Epoch 12/20  Iteration 2079/3560 Training loss: 1.8039 0.3786 sec/batch
Epoch 12/20  Iteration 2080/3560 Training loss: 1.8036 0.3773 sec/batch
Epoch 12/20  Iteration 2081/3560 Training loss: 1.8033 0.3786 sec/batch
Epoch 12/20  Iteration 2082/3560 Training loss: 1.8034 0.3865 sec/batch
Epoch 12/20  Iteration 2083/3560 Training loss: 1.8034 0.3980 sec/batch
Epoch 12/20  Iteration 2084/3560 Training loss: 1.8030 0.3829 sec/batch
Epoch 12/20  Iteration 2085/3560 Training loss: 1.8031 0.3790 sec/batch
Epoch 12/20  Iteration 2086/3560 Training loss: 1.8032 0.3780 sec/batch
Epoch 12/20  Iteration 2087/3560 Training loss: 1.8031 0.4042 sec/batch
Epoch 12/20  Iteration 2088/3560 Training loss: 1.8030 0.3775 sec/batch
Epoch 12/20  Iteration 2089/3560 Training loss: 1.8027 0.3764 sec/batch
Epoch 12/20  Iteration 2090/3560 Training loss: 1.8025 0.3810 sec/batch
Epoch 12/20  Iteration 2091/3560 Training loss: 1.8025 0.3865 sec/batch
Epoch 12/20  Iteration 2092/3560 Training loss: 1.8026 0.3819 sec/batch
Epoch 12/20  Iteration 2093/3560 Training loss: 1.8026 0.3765 sec/batch
Epoch 12/20  Iteration 2094/3560 Training loss: 1.8026 0.3805 sec/batch
Epoch 12/20  Iteration 2095/3560 Training loss: 1.8027 0.3802 sec/batch
Epoch 12/20  Iteration 2096/3560 Training loss: 1.8027 0.4193 sec/batch
Epoch 12/20  Iteration 2097/3560 Training loss: 1.8028 0.3883 sec/batch
Epoch 12/20  Iteration 2098/3560 Training loss: 1.8026 0.3802 sec/batch
Epoch 12/20  Iteration 2099/3560 Training loss: 1.8029 0.3799 sec/batch
Epoch 12/20  Iteration 2100/3560 Training loss: 1.8028 0.3764 sec/batch
Epoch 12/20  Iteration 2101/3560 Training loss: 1.8027 0.3777 sec/batch
Epoch 12/20  Iteration 2102/3560 Training loss: 1.8028 0.3748 sec/batch
Epoch 12/20  Iteration 2103/3560 Training loss: 1.8027 0.3739 sec/batch
Epoch 12/20  Iteration 2104/3560 Training loss: 1.8027 0.3750 sec/batch
Epoch 12/20  Iteration 2105/3560 Training loss: 1.8028 0.3728 sec/batch
Epoch 12/20  Iteration 2106/3560 Training loss: 1.8029 0.3812 sec/batch
Epoch 12/20  Iteration 2107/3560 Training loss: 1.8029 0.3768 sec/batch
Epoch 12/20  Iteration 2108/3560 Training loss: 1.8028 0.3766 sec/batch
Epoch 12/20  Iteration 2109/3560 Training loss: 1.8026 0.4011 sec/batch
Epoch 12/20  Iteration 2110/3560 Training loss: 1.8027 0.3757 sec/batch
Epoch 12/20  Iteration 2111/3560 Training loss: 1.8027 0.3811 sec/batch
Epoch 12/20  Iteration 2112/3560 Training loss: 1.8027 0.3864 sec/batch
Epoch 12/20  Iteration 2113/3560 Training loss: 1.8027 0.3845 sec/batch
Epoch 12/20  Iteration 2114/3560 Training loss: 1.8027 0.3732 sec/batch
Epoch 12/20  Iteration 2115/3560 Training loss: 1.8027 0.3721 sec/batch
Epoch 12/20  Iteration 2116/3560 Training loss: 1.8026 0.3813 sec/batch
Epoch 12/20  Iteration 2117/3560 Training loss: 1.8024 0.3834 sec/batch
Epoch 12/20  Iteration 2118/3560 Training loss: 1.8026 0.3828 sec/batch
Epoch 12/20  Iteration 2119/3560 Training loss: 1.8027 0.3793 sec/batch
Epoch 12/20  Iteration 2120/3560 Training loss: 1.8027 0.3783 sec/batch
Epoch 12/20  Iteration 2121/3560 Training loss: 1.8027 0.3751 sec/batch
Epoch 12/20  Iteration 2122/3560 Training loss: 1.8027 0.3759 sec/batch
Epoch 12/20  Iteration 2123/3560 Training loss: 1.8027 0.4011 sec/batch
Epoch 12/20  Iteration 2124/3560 Training loss: 1.8026 0.3735 sec/batch
Epoch 12/20  Iteration 2125/3560 Training loss: 1.8027 0.3835 sec/batch
Epoch 12/20  Iteration 2126/3560 Training loss: 1.8031 0.3829 sec/batch
Epoch 12/20  Iteration 2127/3560 Training loss: 1.8031 0.3881 sec/batch
Epoch 12/20  Iteration 2128/3560 Training loss: 1.8030 0.3753 sec/batch
Epoch 12/20  Iteration 2129/3560 Training loss: 1.8029 0.3770 sec/batch
Epoch 12/20  Iteration 2130/3560 Training loss: 1.8028 0.3791 sec/batch
Epoch 12/20  Iteration 2131/3560 Training loss: 1.8028 0.3765 sec/batch
Epoch 12/20  Iteration 2132/3560 Training loss: 1.8028 0.3765 sec/batch
Epoch 12/20  Iteration 2133/3560 Training loss: 1.8029 0.3903 sec/batch
Epoch 12/20  Iteration 2134/3560 Training loss: 1.8028 0.3836 sec/batch
Epoch 12/20  Iteration 2135/3560 Training loss: 1.8026 0.3763 sec/batch
Epoch 12/20  Iteration 2136/3560 Training loss: 1.8027 0.3951 sec/batch
Epoch 13/20  Iteration 2137/3560 Training loss: 1.8618 0.3911 sec/batch
Epoch 13/20  Iteration 2138/3560 Training loss: 1.8267 0.3825 sec/batch
Epoch 13/20  Iteration 2139/3560 Training loss: 1.8153 0.3760 sec/batch
Epoch 13/20  Iteration 2140/3560 Training loss: 1.8100 0.3787 sec/batch
Epoch 13/20  Iteration 2141/3560 Training loss: 1.8057 0.3861 sec/batch
Epoch 13/20  Iteration 2142/3560 Training loss: 1.7964 0.3813 sec/batch
Epoch 13/20  Iteration 2143/3560 Training loss: 1.7973 0.3872 sec/batch
Epoch 13/20  Iteration 2144/3560 Training loss: 1.7960 0.3762 sec/batch
Epoch 13/20  Iteration 2145/3560 Training loss: 1.7984 0.3772 sec/batch
Epoch 13/20  Iteration 2146/3560 Training loss: 1.7974 0.3794 sec/batch
Epoch 13/20  Iteration 2147/3560 Training loss: 1.7938 0.3795 sec/batch
Epoch 13/20  Iteration 2148/3560 Training loss: 1.7924 0.3806 sec/batch
Epoch 13/20  Iteration 2149/3560 Training loss: 1.7923 0.4153 sec/batch
Epoch 13/20  Iteration 2150/3560 Training loss: 1.7947 0.3899 sec/batch
Epoch 13/20  Iteration 2151/3560 Training loss: 1.7937 0.3836 sec/batch
Epoch 13/20  Iteration 2152/3560 Training loss: 1.7909 0.3876 sec/batch
Epoch 13/20  Iteration 2153/3560 Training loss: 1.7908 0.3810 sec/batch
Epoch 13/20  Iteration 2154/3560 Training loss: 1.7924 0.3772 sec/batch
Epoch 13/20  Iteration 2155/3560 Training loss: 1.7922 0.3745 sec/batch
Epoch 13/20  Iteration 2156/3560 Training loss: 1.7930 0.3801 sec/batch
Epoch 13/20  Iteration 2157/3560 Training loss: 1.7927 0.3883 sec/batch
Epoch 13/20  Iteration 2158/3560 Training loss: 1.7931 0.3859 sec/batch
Epoch 13/20  Iteration 2159/3560 Training loss: 1.7925 0.3742 sec/batch
Epoch 13/20  Iteration 2160/3560 Training loss: 1.7922 0.3773 sec/batch
Epoch 13/20  Iteration 2161/3560 Training loss: 1.7923 0.3836 sec/batch
Epoch 13/20  Iteration 2162/3560 Training loss: 1.7915 0.3968 sec/batch
Epoch 13/20  Iteration 2163/3560 Training loss: 1.7909 0.3744 sec/batch
Epoch 13/20  Iteration 2164/3560 Training loss: 1.7913 0.3759 sec/batch
Epoch 13/20  Iteration 2165/3560 Training loss: 1.7925 0.3778 sec/batch
Epoch 13/20  Iteration 2166/3560 Training loss: 1.7925 0.3750 sec/batch
Epoch 13/20  Iteration 2167/3560 Training loss: 1.7924 0.3802 sec/batch
Epoch 13/20  Iteration 2168/3560 Training loss: 1.7916 0.3800 sec/batch
Epoch 13/20  Iteration 2169/3560 Training loss: 1.7919 0.3801 sec/batch
Epoch 13/20  Iteration 2170/3560 Training loss: 1.7926 0.3765 sec/batch
Epoch 13/20  Iteration 2171/3560 Training loss: 1.7921 0.3755 sec/batch
Epoch 13/20  Iteration 2172/3560 Training loss: 1.7921 0.3836 sec/batch
Epoch 13/20  Iteration 2173/3560 Training loss: 1.7914 0.3760 sec/batch
Epoch 13/20  Iteration 2174/3560 Training loss: 1.7904 0.3826 sec/batch
Epoch 13/20  Iteration 2175/3560 Training loss: 1.7893 0.3891 sec/batch
Epoch 13/20  Iteration 2176/3560 Training loss: 1.7888 0.3891 sec/batch
Epoch 13/20  Iteration 2177/3560 Training loss: 1.7884 0.3780 sec/batch
Epoch 13/20  Iteration 2178/3560 Training loss: 1.7885 0.3782 sec/batch
Epoch 13/20  Iteration 2179/3560 Training loss: 1.7881 0.3744 sec/batch
Epoch 13/20  Iteration 2180/3560 Training loss: 1.7872 0.3816 sec/batch
Epoch 13/20  Iteration 2181/3560 Training loss: 1.7873 0.3790 sec/batch
Epoch 13/20  Iteration 2182/3560 Training loss: 1.7864 0.3821 sec/batch
Epoch 13/20  Iteration 2183/3560 Training loss: 1.7864 0.3771 sec/batch
Epoch 13/20  Iteration 2184/3560 Training loss: 1.7860 0.3768 sec/batch
Epoch 13/20  Iteration 2185/3560 Training loss: 1.7860 0.3843 sec/batch
Epoch 13/20  Iteration 2186/3560 Training loss: 1.7867 0.3766 sec/batch
Epoch 13/20  Iteration 2187/3560 Training loss: 1.7861 0.3814 sec/batch
Epoch 13/20  Iteration 2188/3560 Training loss: 1.7870 0.3761 sec/batch
Epoch 13/20  Iteration 2189/3560 Training loss: 1.7869 0.3977 sec/batch
Epoch 13/20  Iteration 2190/3560 Training loss: 1.7870 0.3784 sec/batch
Epoch 13/20  Iteration 2191/3560 Training loss: 1.7867 0.3730 sec/batch
Epoch 13/20  Iteration 2192/3560 Training loss: 1.7868 0.3774 sec/batch
Epoch 13/20  Iteration 2193/3560 Training loss: 1.7871 0.3765 sec/batch
Epoch 13/20  Iteration 2194/3560 Training loss: 1.7868 0.3754 sec/batch
Epoch 13/20  Iteration 2195/3560 Training loss: 1.7862 0.3738 sec/batch
Epoch 13/20  Iteration 2196/3560 Training loss: 1.7869 0.3755 sec/batch
Epoch 13/20  Iteration 2197/3560 Training loss: 1.7868 0.3800 sec/batch
Epoch 13/20  Iteration 2198/3560 Training loss: 1.7874 0.3810 sec/batch
Epoch 13/20  Iteration 2199/3560 Training loss: 1.7879 0.3798 sec/batch
Epoch 13/20  Iterations 2200-2314/3560  Training loss: 1.7881 -> 1.7797  ~0.38 sec/batch
Epoch 14/20  Iterations 2315-2492/3560  Training loss: 1.8477 -> 1.7594  ~0.38 sec/batch
Epoch 15/20  Iterations 2493-2670/3560  Training loss: 1.8232 -> 1.7411  ~0.38 sec/batch
Epoch 16/20  Iterations 2671-2848/3560  Training loss: 1.8113 -> 1.7239  ~0.38-0.41 sec/batch
Epoch 17/20  Iterations 2849-3026/3560  Training loss: 1.7959 -> 1.7089  ~0.41 sec/batch
Epoch 18/20  Iterations 3027-3090/3560  Training loss: 1.7777 -> 1.7011  ~0.41 sec/batch
Epoch 18/20  Iteration 3091/3560 Training loss: 1.7007 0.4049 sec/batch
Epoch 18/20  Iteration 3092/3560 Training loss: 1.7010 0.4011 sec/batch
Epoch 18/20  Iteration 3093/3560 Training loss: 1.7013 0.4041 sec/batch
Epoch 18/20  Iteration 3094/3560 Training loss: 1.7008 0.4041 sec/batch
Epoch 18/20  Iteration 3095/3560 Training loss: 1.7009 0.4056 sec/batch
Epoch 18/20  Iteration 3096/3560 Training loss: 1.7007 0.4067 sec/batch
Epoch 18/20  Iteration 3097/3560 Training loss: 1.7013 0.4020 sec/batch
Epoch 18/20  Iteration 3098/3560 Training loss: 1.7015 0.4050 sec/batch
Epoch 18/20  Iteration 3099/3560 Training loss: 1.7020 0.4121 sec/batch
Epoch 18/20  Iteration 3100/3560 Training loss: 1.7019 0.4012 sec/batch
Epoch 18/20  Iteration 3101/3560 Training loss: 1.7018 0.4065 sec/batch
Epoch 18/20  Iteration 3102/3560 Training loss: 1.7020 0.4269 sec/batch
Epoch 18/20  Iteration 3103/3560 Training loss: 1.7020 0.4066 sec/batch
Epoch 18/20  Iteration 3104/3560 Training loss: 1.7021 0.4038 sec/batch
Epoch 18/20  Iteration 3105/3560 Training loss: 1.7017 0.4051 sec/batch
Epoch 18/20  Iteration 3106/3560 Training loss: 1.7016 0.4081 sec/batch
Epoch 18/20  Iteration 3107/3560 Training loss: 1.7010 0.4062 sec/batch
Epoch 18/20  Iteration 3108/3560 Training loss: 1.7012 0.4068 sec/batch
Epoch 18/20  Iteration 3109/3560 Training loss: 1.7006 0.4049 sec/batch
Epoch 18/20  Iteration 3110/3560 Training loss: 1.7006 0.4066 sec/batch
Epoch 18/20  Iteration 3111/3560 Training loss: 1.7002 0.4042 sec/batch
Epoch 18/20  Iteration 3112/3560 Training loss: 1.7000 0.4049 sec/batch
Epoch 18/20  Iteration 3113/3560 Training loss: 1.6997 0.4065 sec/batch
Epoch 18/20  Iteration 3114/3560 Training loss: 1.6994 0.4519 sec/batch
Epoch 18/20  Iteration 3115/3560 Training loss: 1.6991 0.4038 sec/batch
Epoch 18/20  Iteration 3116/3560 Training loss: 1.6993 0.4216 sec/batch
Epoch 18/20  Iteration 3117/3560 Training loss: 1.6990 0.4056 sec/batch
Epoch 18/20  Iteration 3118/3560 Training loss: 1.6987 0.4054 sec/batch
Epoch 18/20  Iteration 3119/3560 Training loss: 1.6983 0.4059 sec/batch
Epoch 18/20  Iteration 3120/3560 Training loss: 1.6980 0.4050 sec/batch
Epoch 18/20  Iteration 3121/3560 Training loss: 1.6977 0.4054 sec/batch
Epoch 18/20  Iteration 3122/3560 Training loss: 1.6978 0.4140 sec/batch
Epoch 18/20  Iteration 3123/3560 Training loss: 1.6978 0.4036 sec/batch
Epoch 18/20  Iteration 3124/3560 Training loss: 1.6975 0.4114 sec/batch
Epoch 18/20  Iteration 3125/3560 Training loss: 1.6972 0.4085 sec/batch
Epoch 18/20  Iteration 3126/3560 Training loss: 1.6969 0.4078 sec/batch
Epoch 18/20  Iteration 3127/3560 Training loss: 1.6969 0.4354 sec/batch
Epoch 18/20  Iteration 3128/3560 Training loss: 1.6968 0.4042 sec/batch
Epoch 18/20  Iteration 3129/3560 Training loss: 1.6966 0.4033 sec/batch
Epoch 18/20  Iteration 3130/3560 Training loss: 1.6966 0.4053 sec/batch
Epoch 18/20  Iteration 3131/3560 Training loss: 1.6965 0.4079 sec/batch
Epoch 18/20  Iteration 3132/3560 Training loss: 1.6964 0.4020 sec/batch
Epoch 18/20  Iteration 3133/3560 Training loss: 1.6963 0.4082 sec/batch
Epoch 18/20  Iteration 3134/3560 Training loss: 1.6964 0.4029 sec/batch
Epoch 18/20  Iteration 3135/3560 Training loss: 1.6964 0.4054 sec/batch
Epoch 18/20  Iteration 3136/3560 Training loss: 1.6965 0.4299 sec/batch
Epoch 18/20  Iteration 3137/3560 Training loss: 1.6964 0.5278 sec/batch
Epoch 18/20  Iteration 3138/3560 Training loss: 1.6964 0.4607 sec/batch
Epoch 18/20  Iteration 3139/3560 Training loss: 1.6962 0.4150 sec/batch
Epoch 18/20  Iteration 3140/3560 Training loss: 1.6961 0.4202 sec/batch
Epoch 18/20  Iteration 3141/3560 Training loss: 1.6959 0.4163 sec/batch
Epoch 18/20  Iteration 3142/3560 Training loss: 1.6956 0.4354 sec/batch
Epoch 18/20  Iteration 3143/3560 Training loss: 1.6955 0.4020 sec/batch
Epoch 18/20  Iteration 3144/3560 Training loss: 1.6954 0.4040 sec/batch
Epoch 18/20  Iteration 3145/3560 Training loss: 1.6954 0.4031 sec/batch
Epoch 18/20  Iteration 3146/3560 Training loss: 1.6953 0.4106 sec/batch
Epoch 18/20  Iteration 3147/3560 Training loss: 1.6952 0.4090 sec/batch
Epoch 18/20  Iteration 3148/3560 Training loss: 1.6949 0.4047 sec/batch
Epoch 18/20  Iteration 3149/3560 Training loss: 1.6946 0.4054 sec/batch
Epoch 18/20  Iteration 3150/3560 Training loss: 1.6947 0.4051 sec/batch
Epoch 18/20  Iteration 3151/3560 Training loss: 1.6946 0.4716 sec/batch
Epoch 18/20  Iteration 3152/3560 Training loss: 1.6943 0.4056 sec/batch
Epoch 18/20  Iteration 3153/3560 Training loss: 1.6945 0.4193 sec/batch
Epoch 18/20  Iteration 3154/3560 Training loss: 1.6946 0.4300 sec/batch
Epoch 18/20  Iteration 3155/3560 Training loss: 1.6945 0.4106 sec/batch
Epoch 18/20  Iteration 3156/3560 Training loss: 1.6943 0.4090 sec/batch
Epoch 18/20  Iteration 3157/3560 Training loss: 1.6940 0.4070 sec/batch
Epoch 18/20  Iteration 3158/3560 Training loss: 1.6939 0.4044 sec/batch
Epoch 18/20  Iteration 3159/3560 Training loss: 1.6940 0.4033 sec/batch
Epoch 18/20  Iteration 3160/3560 Training loss: 1.6940 0.4098 sec/batch
Epoch 18/20  Iteration 3161/3560 Training loss: 1.6940 0.4054 sec/batch
Epoch 18/20  Iteration 3162/3560 Training loss: 1.6941 0.4137 sec/batch
Epoch 18/20  Iteration 3163/3560 Training loss: 1.6942 0.4372 sec/batch
Epoch 18/20  Iteration 3164/3560 Training loss: 1.6942 0.4110 sec/batch
Epoch 18/20  Iteration 3165/3560 Training loss: 1.6943 0.4196 sec/batch
Epoch 18/20  Iteration 3166/3560 Training loss: 1.6943 0.4058 sec/batch
Epoch 18/20  Iteration 3167/3560 Training loss: 1.6946 0.4091 sec/batch
Epoch 18/20  Iteration 3168/3560 Training loss: 1.6946 0.4183 sec/batch
Epoch 18/20  Iteration 3169/3560 Training loss: 1.6945 0.4051 sec/batch
Epoch 18/20  Iteration 3170/3560 Training loss: 1.6946 0.4077 sec/batch
Epoch 18/20  Iteration 3171/3560 Training loss: 1.6946 0.4089 sec/batch
Epoch 18/20  Iteration 3172/3560 Training loss: 1.6947 0.4055 sec/batch
Epoch 18/20  Iteration 3173/3560 Training loss: 1.6947 0.4094 sec/batch
Epoch 18/20  Iteration 3174/3560 Training loss: 1.6949 0.4073 sec/batch
Epoch 18/20  Iteration 3175/3560 Training loss: 1.6950 0.4321 sec/batch
Epoch 18/20  Iteration 3176/3560 Training loss: 1.6948 0.4059 sec/batch
Epoch 18/20  Iteration 3177/3560 Training loss: 1.6945 0.4081 sec/batch
Epoch 18/20  Iteration 3178/3560 Training loss: 1.6946 0.4054 sec/batch
Epoch 18/20  Iteration 3179/3560 Training loss: 1.6946 0.4045 sec/batch
Epoch 18/20  Iteration 3180/3560 Training loss: 1.6947 0.4096 sec/batch
Epoch 18/20  Iteration 3181/3560 Training loss: 1.6946 0.4080 sec/batch
Epoch 18/20  Iteration 3182/3560 Training loss: 1.6947 0.4057 sec/batch
Epoch 18/20  Iteration 3183/3560 Training loss: 1.6947 0.4168 sec/batch
Epoch 18/20  Iteration 3184/3560 Training loss: 1.6947 0.4199 sec/batch
Epoch 18/20  Iteration 3185/3560 Training loss: 1.6944 0.4039 sec/batch
Epoch 18/20  Iteration 3186/3560 Training loss: 1.6946 0.4318 sec/batch
Epoch 18/20  Iteration 3187/3560 Training loss: 1.6947 0.4356 sec/batch
Epoch 18/20  Iteration 3188/3560 Training loss: 1.6947 0.4708 sec/batch
Epoch 18/20  Iteration 3189/3560 Training loss: 1.6948 0.4075 sec/batch
Epoch 18/20  Iteration 3190/3560 Training loss: 1.6948 0.3966 sec/batch
Epoch 18/20  Iteration 3191/3560 Training loss: 1.6948 0.4062 sec/batch
Epoch 18/20  Iteration 3192/3560 Training loss: 1.6948 0.4021 sec/batch
Epoch 18/20  Iteration 3193/3560 Training loss: 1.6948 0.3992 sec/batch
Epoch 18/20  Iteration 3194/3560 Training loss: 1.6952 0.4028 sec/batch
Epoch 18/20  Iteration 3195/3560 Training loss: 1.6952 0.4070 sec/batch
Epoch 18/20  Iteration 3196/3560 Training loss: 1.6952 0.4122 sec/batch
Epoch 18/20  Iteration 3197/3560 Training loss: 1.6951 0.4073 sec/batch
Epoch 18/20  Iteration 3198/3560 Training loss: 1.6950 0.3995 sec/batch
Epoch 18/20  Iteration 3199/3560 Training loss: 1.6951 0.4084 sec/batch
Epoch 18/20  Iteration 3200/3560 Training loss: 1.6951 0.4406 sec/batch
Epoch 18/20  Iteration 3201/3560 Training loss: 1.6952 0.4006 sec/batch
Epoch 18/20  Iteration 3202/3560 Training loss: 1.6951 0.4055 sec/batch
Epoch 18/20  Iteration 3203/3560 Training loss: 1.6949 0.4032 sec/batch
Epoch 18/20  Iteration 3204/3560 Training loss: 1.6950 0.4040 sec/batch
Epoch 19/20  Iteration 3205/3560 Training loss: 1.7585 0.3996 sec/batch
Epoch 19/20  Iteration 3206/3560 Training loss: 1.7277 0.4069 sec/batch
Epoch 19/20  Iteration 3207/3560 Training loss: 1.7179 0.4070 sec/batch
Epoch 19/20  Iteration 3208/3560 Training loss: 1.7087 0.4059 sec/batch
Epoch 19/20  Iteration 3209/3560 Training loss: 1.7027 0.4055 sec/batch
Epoch 19/20  Iteration 3210/3560 Training loss: 1.6933 0.4083 sec/batch
Epoch 19/20  Iteration 3211/3560 Training loss: 1.6937 0.4018 sec/batch
Epoch 19/20  Iteration 3212/3560 Training loss: 1.6921 0.4430 sec/batch
Epoch 19/20  Iteration 3213/3560 Training loss: 1.6950 0.4035 sec/batch
Epoch 19/20  Iteration 3214/3560 Training loss: 1.6938 0.4087 sec/batch
Epoch 19/20  Iteration 3215/3560 Training loss: 1.6912 0.4076 sec/batch
Epoch 19/20  Iteration 3216/3560 Training loss: 1.6897 0.4081 sec/batch
Epoch 19/20  Iteration 3217/3560 Training loss: 1.6881 0.4062 sec/batch
Epoch 19/20  Iteration 3218/3560 Training loss: 1.6912 0.4078 sec/batch
Epoch 19/20  Iteration 3219/3560 Training loss: 1.6901 0.4021 sec/batch
Epoch 19/20  Iteration 3220/3560 Training loss: 1.6885 0.4013 sec/batch
Epoch 19/20  Iteration 3221/3560 Training loss: 1.6881 0.4036 sec/batch
Epoch 19/20  Iteration 3222/3560 Training loss: 1.6894 0.4099 sec/batch
Epoch 19/20  Iteration 3223/3560 Training loss: 1.6898 0.4051 sec/batch
Epoch 19/20  Iteration 3224/3560 Training loss: 1.6909 0.4307 sec/batch
Epoch 19/20  Iteration 3225/3560 Training loss: 1.6907 0.4065 sec/batch
Epoch 19/20  Iteration 3226/3560 Training loss: 1.6917 0.4156 sec/batch
Epoch 19/20  Iteration 3227/3560 Training loss: 1.6907 0.4369 sec/batch
Epoch 19/20  Iteration 3228/3560 Training loss: 1.6905 0.4182 sec/batch
Epoch 19/20  Iteration 3229/3560 Training loss: 1.6904 0.4203 sec/batch
Epoch 19/20  Iteration 3230/3560 Training loss: 1.6891 0.4210 sec/batch
Epoch 19/20  Iteration 3231/3560 Training loss: 1.6878 0.4001 sec/batch
Epoch 19/20  Iteration 3232/3560 Training loss: 1.6882 0.4298 sec/batch
Epoch 19/20  Iteration 3233/3560 Training loss: 1.6890 0.4060 sec/batch
Epoch 19/20  Iteration 3234/3560 Training loss: 1.6893 0.4047 sec/batch
Epoch 19/20  Iteration 3235/3560 Training loss: 1.6892 0.4057 sec/batch
Epoch 19/20  Iteration 3236/3560 Training loss: 1.6886 0.4134 sec/batch
Epoch 19/20  Iteration 3237/3560 Training loss: 1.6889 0.4231 sec/batch
Epoch 19/20  Iteration 3238/3560 Training loss: 1.6896 0.4014 sec/batch
Epoch 19/20  Iteration 3239/3560 Training loss: 1.6893 0.4058 sec/batch
Epoch 19/20  Iteration 3240/3560 Training loss: 1.6895 0.4113 sec/batch
Epoch 19/20  Iteration 3241/3560 Training loss: 1.6889 0.4132 sec/batch
Epoch 19/20  Iteration 3242/3560 Training loss: 1.6880 0.4199 sec/batch
Epoch 19/20  Iteration 3243/3560 Training loss: 1.6867 0.4162 sec/batch
Epoch 19/20  Iteration 3244/3560 Training loss: 1.6860 0.4210 sec/batch
Epoch 19/20  Iteration 3245/3560 Training loss: 1.6857 0.4076 sec/batch
Epoch 19/20  Iteration 3246/3560 Training loss: 1.6862 0.4165 sec/batch
Epoch 19/20  Iteration 3247/3560 Training loss: 1.6857 0.4101 sec/batch
Epoch 19/20  Iteration 3248/3560 Training loss: 1.6851 0.4038 sec/batch
Epoch 19/20  Iteration 3249/3560 Training loss: 1.6853 0.4337 sec/batch
Epoch 19/20  Iteration 3250/3560 Training loss: 1.6843 0.4120 sec/batch
Epoch 19/20  Iteration 3251/3560 Training loss: 1.6839 0.4041 sec/batch
Epoch 19/20  Iteration 3252/3560 Training loss: 1.6835 0.4048 sec/batch
Epoch 19/20  Iteration 3253/3560 Training loss: 1.6832 0.4060 sec/batch
Epoch 19/20  Iteration 3254/3560 Training loss: 1.6840 0.4128 sec/batch
Epoch 19/20  Iteration 3255/3560 Training loss: 1.6835 0.4072 sec/batch
Epoch 19/20  Iteration 3256/3560 Training loss: 1.6845 0.4047 sec/batch
Epoch 19/20  Iteration 3257/3560 Training loss: 1.6844 0.4083 sec/batch
Epoch 19/20  Iteration 3258/3560 Training loss: 1.6847 0.4095 sec/batch
Epoch 19/20  Iteration 3259/3560 Training loss: 1.6844 0.4041 sec/batch
Epoch 19/20  Iteration 3260/3560 Training loss: 1.6845 0.4229 sec/batch
Epoch 19/20  Iteration 3261/3560 Training loss: 1.6848 0.4330 sec/batch
Epoch 19/20  Iteration 3262/3560 Training loss: 1.6845 0.3999 sec/batch
Epoch 19/20  Iteration 3263/3560 Training loss: 1.6841 0.4091 sec/batch
Epoch 19/20  Iteration 3264/3560 Training loss: 1.6848 0.4003 sec/batch
Epoch 19/20  Iteration 3265/3560 Training loss: 1.6847 0.4064 sec/batch
Epoch 19/20  Iteration 3266/3560 Training loss: 1.6857 0.4042 sec/batch
Epoch 19/20  Iteration 3267/3560 Training loss: 1.6861 0.4077 sec/batch
Epoch 19/20  Iteration 3268/3560 Training loss: 1.6865 0.4002 sec/batch
Epoch 19/20  Iteration 3269/3560 Training loss: 1.6864 0.4132 sec/batch
Epoch 19/20  Iteration 3270/3560 Training loss: 1.6865 0.4060 sec/batch
Epoch 19/20  Iteration 3271/3560 Training loss: 1.6868 0.4032 sec/batch
Epoch 19/20  Iteration 3272/3560 Training loss: 1.6865 0.4013 sec/batch
Epoch 19/20  Iteration 3273/3560 Training loss: 1.6866 0.4226 sec/batch
Epoch 19/20  Iteration 3274/3560 Training loss: 1.6866 0.4188 sec/batch
Epoch 19/20  Iteration 3275/3560 Training loss: 1.6871 0.4094 sec/batch
Epoch 19/20  Iteration 3276/3560 Training loss: 1.6873 0.4056 sec/batch
Epoch 19/20  Iteration 3277/3560 Training loss: 1.6879 0.4019 sec/batch
Epoch 19/20  Iteration 3278/3560 Training loss: 1.6877 0.4090 sec/batch
Epoch 19/20  Iteration 3279/3560 Training loss: 1.6877 0.4047 sec/batch
Epoch 19/20  Iteration 3280/3560 Training loss: 1.6879 0.4023 sec/batch
Epoch 19/20  Iteration 3281/3560 Training loss: 1.6877 0.4052 sec/batch
Epoch 19/20  Iteration 3282/3560 Training loss: 1.6879 0.4131 sec/batch
Epoch 19/20  Iteration 3283/3560 Training loss: 1.6874 0.4173 sec/batch
Epoch 19/20  Iteration 3284/3560 Training loss: 1.6873 0.4148 sec/batch
Epoch 19/20  Iteration 3285/3560 Training loss: 1.6868 0.4078 sec/batch
Epoch 19/20  Iteration 3286/3560 Training loss: 1.6869 0.4356 sec/batch
Epoch 19/20  Iteration 3287/3560 Training loss: 1.6865 0.4067 sec/batch
Epoch 19/20  Iteration 3288/3560 Training loss: 1.6864 0.4054 sec/batch
Epoch 19/20  Iteration 3289/3560 Training loss: 1.6860 0.4044 sec/batch
Epoch 19/20  Iteration 3290/3560 Training loss: 1.6857 0.4033 sec/batch
Epoch 19/20  Iteration 3291/3560 Training loss: 1.6855 0.4013 sec/batch
Epoch 19/20  Iteration 3292/3560 Training loss: 1.6854 0.4046 sec/batch
Epoch 19/20  Iteration 3293/3560 Training loss: 1.6849 0.4058 sec/batch
Epoch 19/20  Iteration 3294/3560 Training loss: 1.6850 0.4129 sec/batch
Epoch 19/20  Iteration 3295/3560 Training loss: 1.6847 0.4086 sec/batch
Epoch 19/20  Iteration 3296/3560 Training loss: 1.6846 0.4069 sec/batch
Epoch 19/20  Iteration 3297/3560 Training loss: 1.6842 0.4162 sec/batch
Epoch 19/20  Iteration 3298/3560 Training loss: 1.6838 0.4329 sec/batch
Epoch 19/20  Iteration 3299/3560 Training loss: 1.6836 0.4170 sec/batch
Epoch 19/20  Iteration 3300/3560 Training loss: 1.6836 0.4067 sec/batch
Epoch 19/20  Iteration 3301/3560 Training loss: 1.6836 0.4082 sec/batch
Epoch 19/20  Iteration 3302/3560 Training loss: 1.6834 0.4152 sec/batch
Epoch 19/20  Iteration 3303/3560 Training loss: 1.6832 0.3996 sec/batch
Epoch 19/20  Iteration 3304/3560 Training loss: 1.6828 0.4053 sec/batch
Epoch 19/20  Iteration 3305/3560 Training loss: 1.6828 0.4062 sec/batch
Epoch 19/20  Iteration 3306/3560 Training loss: 1.6827 0.4079 sec/batch
Epoch 19/20  Iteration 3307/3560 Training loss: 1.6825 0.4058 sec/batch
Epoch 19/20  Iteration 3308/3560 Training loss: 1.6825 0.4058 sec/batch
Epoch 19/20  Iteration 3309/3560 Training loss: 1.6823 0.4034 sec/batch
Epoch 19/20  Iteration 3310/3560 Training loss: 1.6822 0.4309 sec/batch
Epoch 19/20  Iteration 3311/3560 Training loss: 1.6822 0.4028 sec/batch
Epoch 19/20  Iteration 3312/3560 Training loss: 1.6822 0.4169 sec/batch
Epoch 19/20  Iteration 3313/3560 Training loss: 1.6822 0.4049 sec/batch
Epoch 19/20  Iteration 3314/3560 Training loss: 1.6822 0.4143 sec/batch
Epoch 19/20  Iteration 3315/3560 Training loss: 1.6821 0.4047 sec/batch
Epoch 19/20  Iteration 3316/3560 Training loss: 1.6820 0.4032 sec/batch
Epoch 19/20  Iteration 3317/3560 Training loss: 1.6819 0.4033 sec/batch
Epoch 19/20  Iteration 3318/3560 Training loss: 1.6818 0.4021 sec/batch
Epoch 19/20  Iteration 3319/3560 Training loss: 1.6816 0.4087 sec/batch
Epoch 19/20  Iteration 3320/3560 Training loss: 1.6812 0.4079 sec/batch
Epoch 19/20  Iteration 3321/3560 Training loss: 1.6813 0.4028 sec/batch
Epoch 19/20  Iteration 3322/3560 Training loss: 1.6813 0.4065 sec/batch
Epoch 19/20  Iteration 3323/3560 Training loss: 1.6812 0.4347 sec/batch
Epoch 19/20  Iteration 3324/3560 Training loss: 1.6812 0.4033 sec/batch
Epoch 19/20  Iteration 3325/3560 Training loss: 1.6812 0.4025 sec/batch
Epoch 19/20  Iteration 3326/3560 Training loss: 1.6809 0.4054 sec/batch
Epoch 19/20  Iteration 3327/3560 Training loss: 1.6806 0.4161 sec/batch
Epoch 19/20  Iteration 3328/3560 Training loss: 1.6807 0.4030 sec/batch
Epoch 19/20  Iteration 3329/3560 Training loss: 1.6806 0.4165 sec/batch
Epoch 19/20  Iteration 3330/3560 Training loss: 1.6803 0.4047 sec/batch
Epoch 19/20  Iteration 3331/3560 Training loss: 1.6805 0.4070 sec/batch
Epoch 19/20  Iteration 3332/3560 Training loss: 1.6806 0.4462 sec/batch
Epoch 19/20  Iteration 3333/3560 Training loss: 1.6806 0.4127 sec/batch
Epoch 19/20  Iteration 3334/3560 Training loss: 1.6805 0.4056 sec/batch
Epoch 19/20  Iteration 3335/3560 Training loss: 1.6802 0.4334 sec/batch
Epoch 19/20  Iteration 3336/3560 Training loss: 1.6799 0.4106 sec/batch
Epoch 19/20  Iteration 3337/3560 Training loss: 1.6800 0.4071 sec/batch
Epoch 19/20  Iteration 3338/3560 Training loss: 1.6800 0.4100 sec/batch
Epoch 19/20  Iteration 3339/3560 Training loss: 1.6802 0.4062 sec/batch
Epoch 19/20  Iteration 3340/3560 Training loss: 1.6802 0.4044 sec/batch
Epoch 19/20  Iteration 3341/3560 Training loss: 1.6802 0.4032 sec/batch
Epoch 19/20  Iteration 3342/3560 Training loss: 1.6803 0.4189 sec/batch
Epoch 19/20  Iteration 3343/3560 Training loss: 1.6805 0.4083 sec/batch
Epoch 19/20  Iteration 3344/3560 Training loss: 1.6804 0.4040 sec/batch
Epoch 19/20  Iteration 3345/3560 Training loss: 1.6807 0.4086 sec/batch
Epoch 19/20  Iteration 3346/3560 Training loss: 1.6806 0.4057 sec/batch
Epoch 19/20  Iteration 3347/3560 Training loss: 1.6806 0.4375 sec/batch
Epoch 19/20  Iteration 3348/3560 Training loss: 1.6808 0.4136 sec/batch
Epoch 19/20  Iteration 3349/3560 Training loss: 1.6807 0.4123 sec/batch
Epoch 19/20  Iteration 3350/3560 Training loss: 1.6808 0.4067 sec/batch
Epoch 19/20  Iteration 3351/3560 Training loss: 1.6809 0.4012 sec/batch
Epoch 19/20  Iteration 3352/3560 Training loss: 1.6811 0.4050 sec/batch
Epoch 19/20  Iteration 3353/3560 Training loss: 1.6812 0.4073 sec/batch
Epoch 19/20  Iteration 3354/3560 Training loss: 1.6811 0.4038 sec/batch
Epoch 19/20  Iteration 3355/3560 Training loss: 1.6808 0.4193 sec/batch
Epoch 19/20  Iteration 3356/3560 Training loss: 1.6809 0.4048 sec/batch
Epoch 19/20  Iteration 3357/3560 Training loss: 1.6809 0.4037 sec/batch
Epoch 19/20  Iteration 3358/3560 Training loss: 1.6810 0.4078 sec/batch
Epoch 19/20  Iteration 3359/3560 Training loss: 1.6810 0.4069 sec/batch
Epoch 19/20  Iteration 3360/3560 Training loss: 1.6811 0.4346 sec/batch
Epoch 19/20  Iteration 3361/3560 Training loss: 1.6811 0.4090 sec/batch
Epoch 19/20  Iteration 3362/3560 Training loss: 1.6810 0.4046 sec/batch
Epoch 19/20  Iteration 3363/3560 Training loss: 1.6808 0.4023 sec/batch
Epoch 19/20  Iteration 3364/3560 Training loss: 1.6810 0.4044 sec/batch
Epoch 19/20  Iteration 3365/3560 Training loss: 1.6812 0.4062 sec/batch
Epoch 19/20  Iteration 3366/3560 Training loss: 1.6812 0.4081 sec/batch
Epoch 19/20  Iteration 3367/3560 Training loss: 1.6813 0.4060 sec/batch
Epoch 19/20  Iteration 3368/3560 Training loss: 1.6813 0.4044 sec/batch
Epoch 19/20  Iteration 3369/3560 Training loss: 1.6813 0.4050 sec/batch
Epoch 19/20  Iteration 3370/3560 Training loss: 1.6812 0.4050 sec/batch
Epoch 19/20  Iteration 3371/3560 Training loss: 1.6814 0.4188 sec/batch
Epoch 19/20  Iteration 3372/3560 Training loss: 1.6818 0.4312 sec/batch
Epoch 19/20  Iteration 3373/3560 Training loss: 1.6818 0.4050 sec/batch
Epoch 19/20  Iteration 3374/3560 Training loss: 1.6818 0.4054 sec/batch
Epoch 19/20  Iteration 3375/3560 Training loss: 1.6818 0.4063 sec/batch
Epoch 19/20  Iteration 3376/3560 Training loss: 1.6816 0.4025 sec/batch
Epoch 19/20  Iteration 3377/3560 Training loss: 1.6817 0.4063 sec/batch
Epoch 19/20  Iteration 3378/3560 Training loss: 1.6817 0.4059 sec/batch
Epoch 19/20  Iteration 3379/3560 Training loss: 1.6817 0.4110 sec/batch
Epoch 19/20  Iteration 3380/3560 Training loss: 1.6816 0.4028 sec/batch
Epoch 19/20  Iteration 3381/3560 Training loss: 1.6815 0.4037 sec/batch
Epoch 19/20  Iteration 3382/3560 Training loss: 1.6816 0.4008 sec/batch
Epoch 20/20  Iteration 3383/3560 Training loss: 1.7563 0.4071 sec/batch
Epoch 20/20  Iteration 3384/3560 Training loss: 1.7230 0.4322 sec/batch
Epoch 20/20  Iteration 3385/3560 Training loss: 1.7073 0.4148 sec/batch
Epoch 20/20  Iteration 3386/3560 Training loss: 1.6986 0.4076 sec/batch
Epoch 20/20  Iteration 3387/3560 Training loss: 1.6923 0.4072 sec/batch
Epoch 20/20  Iteration 3388/3560 Training loss: 1.6829 0.4062 sec/batch
Epoch 20/20  Iteration 3389/3560 Training loss: 1.6841 0.4080 sec/batch
Epoch 20/20  Iteration 3390/3560 Training loss: 1.6826 0.4063 sec/batch
Epoch 20/20  Iteration 3391/3560 Training loss: 1.6848 0.4129 sec/batch
Epoch 20/20  Iteration 3392/3560 Training loss: 1.6831 0.4102 sec/batch
Epoch 20/20  Iteration 3393/3560 Training loss: 1.6789 0.4093 sec/batch
Epoch 20/20  Iteration 3394/3560 Training loss: 1.6778 0.4113 sec/batch
Epoch 20/20  Iteration 3395/3560 Training loss: 1.6770 0.4022 sec/batch
Epoch 20/20  Iteration 3396/3560 Training loss: 1.6794 0.4054 sec/batch
Epoch 20/20  Iteration 3397/3560 Training loss: 1.6782 0.4304 sec/batch
Epoch 20/20  Iteration 3398/3560 Training loss: 1.6766 0.4065 sec/batch
Epoch 20/20  Iteration 3399/3560 Training loss: 1.6766 0.4073 sec/batch
Epoch 20/20  Iteration 3400/3560 Training loss: 1.6780 0.4147 sec/batch
Epoch 20/20  Iteration 3401/3560 Training loss: 1.6783 0.4055 sec/batch
Epoch 20/20  Iteration 3402/3560 Training loss: 1.6792 0.4076 sec/batch
Epoch 20/20  Iteration 3403/3560 Training loss: 1.6787 0.4085 sec/batch
Epoch 20/20  Iteration 3404/3560 Training loss: 1.6792 0.4242 sec/batch
Epoch 20/20  Iteration 3405/3560 Training loss: 1.6784 0.4164 sec/batch
Epoch 20/20  Iteration 3406/3560 Training loss: 1.6778 0.4142 sec/batch
Epoch 20/20  Iteration 3407/3560 Training loss: 1.6774 0.4079 sec/batch
Epoch 20/20  Iteration 3408/3560 Training loss: 1.6763 0.4104 sec/batch
Epoch 20/20  Iteration 3409/3560 Training loss: 1.6753 0.4318 sec/batch
Epoch 20/20  Iteration 3410/3560 Training loss: 1.6756 0.4139 sec/batch
Epoch 20/20  Iteration 3411/3560 Training loss: 1.6764 0.4043 sec/batch
Epoch 20/20  Iteration 3412/3560 Training loss: 1.6767 0.4097 sec/batch
Epoch 20/20  Iteration 3413/3560 Training loss: 1.6763 0.4087 sec/batch
Epoch 20/20  Iteration 3414/3560 Training loss: 1.6753 0.4088 sec/batch
Epoch 20/20  Iteration 3415/3560 Training loss: 1.6757 0.4099 sec/batch
Epoch 20/20  Iteration 3416/3560 Training loss: 1.6765 0.4108 sec/batch
Epoch 20/20  Iteration 3417/3560 Training loss: 1.6763 0.4109 sec/batch
Epoch 20/20  Iteration 3418/3560 Training loss: 1.6760 0.4053 sec/batch
Epoch 20/20  Iteration 3419/3560 Training loss: 1.6754 0.4056 sec/batch
Epoch 20/20  Iteration 3420/3560 Training loss: 1.6747 0.4046 sec/batch
Epoch 20/20  Iteration 3421/3560 Training loss: 1.6735 0.4318 sec/batch
Epoch 20/20  Iteration 3422/3560 Training loss: 1.6730 0.4044 sec/batch
Epoch 20/20  Iteration 3423/3560 Training loss: 1.6728 0.4117 sec/batch
Epoch 20/20  Iteration 3424/3560 Training loss: 1.6734 0.4094 sec/batch
Epoch 20/20  Iteration 3425/3560 Training loss: 1.6728 0.4106 sec/batch
Epoch 20/20  Iteration 3426/3560 Training loss: 1.6722 0.4072 sec/batch
Epoch 20/20  Iteration 3427/3560 Training loss: 1.6727 0.4072 sec/batch
Epoch 20/20  Iteration 3428/3560 Training loss: 1.6717 0.4022 sec/batch
Epoch 20/20  Iteration 3429/3560 Training loss: 1.6714 0.4230 sec/batch
Epoch 20/20  Iteration 3430/3560 Training loss: 1.6711 0.4119 sec/batch
Epoch 20/20  Iteration 3431/3560 Training loss: 1.6710 0.4104 sec/batch
Epoch 20/20  Iteration 3432/3560 Training loss: 1.6719 0.4051 sec/batch
Epoch 20/20  Iteration 3433/3560 Training loss: 1.6717 0.4064 sec/batch
Epoch 20/20  Iteration 3434/3560 Training loss: 1.6726 0.4328 sec/batch
Epoch 20/20  Iteration 3435/3560 Training loss: 1.6723 0.4081 sec/batch
Epoch 20/20  Iteration 3436/3560 Training loss: 1.6724 0.4039 sec/batch
Epoch 20/20  Iteration 3437/3560 Training loss: 1.6722 0.4046 sec/batch
Epoch 20/20  Iteration 3438/3560 Training loss: 1.6724 0.4016 sec/batch
Epoch 20/20  Iteration 3439/3560 Training loss: 1.6728 0.4506 sec/batch
Epoch 20/20  Iteration 3440/3560 Training loss: 1.6723 0.4053 sec/batch
Epoch 20/20  Iteration 3441/3560 Training loss: 1.6719 0.4042 sec/batch
Epoch 20/20  Iteration 3442/3560 Training loss: 1.6725 0.4252 sec/batch
Epoch 20/20  Iteration 3443/3560 Training loss: 1.6724 0.4105 sec/batch
Epoch 20/20  Iteration 3444/3560 Training loss: 1.6732 0.4074 sec/batch
Epoch 20/20  Iteration 3445/3560 Training loss: 1.6735 0.4055 sec/batch
Epoch 20/20  Iteration 3446/3560 Training loss: 1.6738 0.4337 sec/batch
Epoch 20/20  Iteration 3447/3560 Training loss: 1.6735 0.4082 sec/batch
Epoch 20/20  Iteration 3448/3560 Training loss: 1.6739 0.4065 sec/batch
Epoch 20/20  Iteration 3449/3560 Training loss: 1.6742 0.4115 sec/batch
Epoch 20/20  Iteration 3450/3560 Training loss: 1.6739 0.4106 sec/batch
Epoch 20/20  Iteration 3451/3560 Training loss: 1.6739 0.4078 sec/batch
Epoch 20/20  Iteration 3452/3560 Training loss: 1.6738 0.4006 sec/batch
Epoch 20/20  Iteration 3453/3560 Training loss: 1.6744 0.4097 sec/batch
Epoch 20/20  Iteration 3454/3560 Training loss: 1.6747 0.4145 sec/batch
Epoch 20/20  Iteration 3455/3560 Training loss: 1.6752 0.4092 sec/batch
Epoch 20/20  Iteration 3456/3560 Training loss: 1.6749 0.4147 sec/batch
Epoch 20/20  Iteration 3457/3560 Training loss: 1.6750 0.4035 sec/batch
Epoch 20/20  Iteration 3458/3560 Training loss: 1.6751 0.4345 sec/batch
Epoch 20/20  Iteration 3459/3560 Training loss: 1.6751 0.4032 sec/batch
Epoch 20/20  Iteration 3460/3560 Training loss: 1.6751 0.4051 sec/batch
Epoch 20/20  Iteration 3461/3560 Training loss: 1.6746 0.4059 sec/batch
Epoch 20/20  Iteration 3462/3560 Training loss: 1.6747 0.4052 sec/batch
Epoch 20/20  Iteration 3463/3560 Training loss: 1.6742 0.4088 sec/batch
Epoch 20/20  Iteration 3464/3560 Training loss: 1.6744 0.4084 sec/batch
Epoch 20/20  Iteration 3465/3560 Training loss: 1.6739 0.4140 sec/batch
Epoch 20/20  Iteration 3466/3560 Training loss: 1.6738 0.4127 sec/batch
Epoch 20/20  Iteration 3467/3560 Training loss: 1.6734 0.4101 sec/batch
Epoch 20/20  Iteration 3468/3560 Training loss: 1.6732 0.4153 sec/batch
Epoch 20/20  Iteration 3469/3560 Training loss: 1.6730 0.4091 sec/batch
Epoch 20/20  Iteration 3470/3560 Training loss: 1.6727 0.4389 sec/batch
Epoch 20/20  Iteration 3471/3560 Training loss: 1.6722 0.4097 sec/batch
Epoch 20/20  Iteration 3472/3560 Training loss: 1.6723 0.4077 sec/batch
Epoch 20/20  Iteration 3473/3560 Training loss: 1.6720 0.4239 sec/batch
Epoch 20/20  Iteration 3474/3560 Training loss: 1.6718 0.4071 sec/batch
Epoch 20/20  Iteration 3475/3560 Training loss: 1.6714 0.4050 sec/batch
Epoch 20/20  Iteration 3476/3560 Training loss: 1.6710 0.4244 sec/batch
Epoch 20/20  Iteration 3477/3560 Training loss: 1.6707 0.4050 sec/batch
Epoch 20/20  Iteration 3478/3560 Training loss: 1.6707 0.4103 sec/batch
Epoch 20/20  Iteration 3479/3560 Training loss: 1.6707 0.4046 sec/batch
Epoch 20/20  Iteration 3480/3560 Training loss: 1.6703 0.4054 sec/batch
Epoch 20/20  Iteration 3481/3560 Training loss: 1.6701 0.4109 sec/batch
Epoch 20/20  Iteration 3482/3560 Training loss: 1.6697 0.4068 sec/batch
Epoch 20/20  Iteration 3483/3560 Training loss: 1.6697 0.4294 sec/batch
Epoch 20/20  Iteration 3484/3560 Training loss: 1.6696 0.4206 sec/batch
Epoch 20/20  Iteration 3485/3560 Training loss: 1.6694 0.4073 sec/batch
Epoch 20/20  Iteration 3486/3560 Training loss: 1.6693 0.4053 sec/batch
Epoch 20/20  Iteration 3487/3560 Training loss: 1.6691 0.4106 sec/batch
Epoch 20/20  Iteration 3488/3560 Training loss: 1.6689 0.4055 sec/batch
Epoch 20/20  Iteration 3489/3560 Training loss: 1.6688 0.4043 sec/batch
Epoch 20/20  Iteration 3490/3560 Training loss: 1.6688 0.4023 sec/batch
Epoch 20/20  Iteration 3491/3560 Training loss: 1.6688 0.4056 sec/batch
Epoch 20/20  Iteration 3492/3560 Training loss: 1.6688 0.4034 sec/batch
Epoch 20/20  Iteration 3493/3560 Training loss: 1.6688 0.4059 sec/batch
Epoch 20/20  Iteration 3494/3560 Training loss: 1.6688 0.4020 sec/batch
Epoch 20/20  Iteration 3495/3560 Training loss: 1.6686 0.4308 sec/batch
Epoch 20/20  Iteration 3496/3560 Training loss: 1.6685 0.4031 sec/batch
Epoch 20/20  Iteration 3497/3560 Training loss: 1.6683 0.4191 sec/batch
Epoch 20/20  Iteration 3498/3560 Training loss: 1.6680 0.4134 sec/batch
Epoch 20/20  Iteration 3499/3560 Training loss: 1.6680 0.4093 sec/batch
Epoch 20/20  Iteration 3500/3560 Training loss: 1.6681 0.4047 sec/batch
Epoch 20/20  Iteration 3501/3560 Training loss: 1.6680 0.4076 sec/batch
Epoch 20/20  Iteration 3502/3560 Training loss: 1.6680 0.4090 sec/batch
Epoch 20/20  Iteration 3503/3560 Training loss: 1.6680 0.4050 sec/batch
Epoch 20/20  Iteration 3504/3560 Training loss: 1.6677 0.4081 sec/batch
Epoch 20/20  Iteration 3505/3560 Training loss: 1.6673 0.4068 sec/batch
Epoch 20/20  Iteration 3506/3560 Training loss: 1.6674 0.3994 sec/batch
Epoch 20/20  Iteration 3507/3560 Training loss: 1.6673 0.4338 sec/batch
Epoch 20/20  Iteration 3508/3560 Training loss: 1.6670 0.4098 sec/batch
Epoch 20/20  Iteration 3509/3560 Training loss: 1.6672 0.4031 sec/batch
Epoch 20/20  Iteration 3510/3560 Training loss: 1.6672 0.4017 sec/batch
Epoch 20/20  Iteration 3511/3560 Training loss: 1.6671 0.4060 sec/batch
Epoch 20/20  Iteration 3512/3560 Training loss: 1.6670 0.4027 sec/batch
Epoch 20/20  Iteration 3513/3560 Training loss: 1.6668 0.4142 sec/batch
Epoch 20/20  Iteration 3514/3560 Training loss: 1.6666 0.4076 sec/batch
Epoch 20/20  Iteration 3515/3560 Training loss: 1.6667 0.3988 sec/batch
Epoch 20/20  Iteration 3516/3560 Training loss: 1.6667 0.3971 sec/batch
Epoch 20/20  Iteration 3517/3560 Training loss: 1.6667 0.4060 sec/batch
Epoch 20/20  Iteration 3518/3560 Training loss: 1.6669 0.4029 sec/batch
Epoch 20/20  Iteration 3519/3560 Training loss: 1.6670 0.4144 sec/batch
Epoch 20/20  Iteration 3520/3560 Training loss: 1.6670 0.4321 sec/batch
Epoch 20/20  Iteration 3521/3560 Training loss: 1.6671 0.4065 sec/batch
Epoch 20/20  Iteration 3522/3560 Training loss: 1.6669 0.4106 sec/batch
Epoch 20/20  Iteration 3523/3560 Training loss: 1.6673 0.4105 sec/batch
Epoch 20/20  Iteration 3524/3560 Training loss: 1.6673 0.4036 sec/batch
Epoch 20/20  Iteration 3525/3560 Training loss: 1.6673 0.4096 sec/batch
Epoch 20/20  Iteration 3526/3560 Training loss: 1.6675 0.4171 sec/batch
Epoch 20/20  Iteration 3527/3560 Training loss: 1.6674 0.4026 sec/batch
Epoch 20/20  Iteration 3528/3560 Training loss: 1.6675 0.4053 sec/batch
Epoch 20/20  Iteration 3529/3560 Training loss: 1.6676 0.4038 sec/batch
Epoch 20/20  Iteration 3530/3560 Training loss: 1.6679 0.4068 sec/batch
Epoch 20/20  Iteration 3531/3560 Training loss: 1.6679 0.4098 sec/batch
Epoch 20/20  Iteration 3532/3560 Training loss: 1.6678 0.4272 sec/batch
Epoch 20/20  Iteration 3533/3560 Training loss: 1.6676 0.4064 sec/batch
Epoch 20/20  Iteration 3534/3560 Training loss: 1.6676 0.4058 sec/batch
Epoch 20/20  Iteration 3535/3560 Training loss: 1.6676 0.3999 sec/batch
Epoch 20/20  Iteration 3536/3560 Training loss: 1.6678 0.4101 sec/batch
Epoch 20/20  Iteration 3537/3560 Training loss: 1.6678 0.5284 sec/batch
Epoch 20/20  Iteration 3538/3560 Training loss: 1.6679 0.5411 sec/batch
Epoch 20/20  Iteration 3539/3560 Training loss: 1.6679 0.4515 sec/batch
Epoch 20/20  Iteration 3540/3560 Training loss: 1.6679 0.4245 sec/batch
Epoch 20/20  Iteration 3541/3560 Training loss: 1.6677 0.4083 sec/batch
Epoch 20/20  Iteration 3542/3560 Training loss: 1.6678 0.4076 sec/batch
Epoch 20/20  Iteration 3543/3560 Training loss: 1.6681 0.4084 sec/batch
Epoch 20/20  Iteration 3544/3560 Training loss: 1.6681 0.4273 sec/batch
Epoch 20/20  Iteration 3545/3560 Training loss: 1.6682 0.4065 sec/batch
Epoch 20/20  Iteration 3546/3560 Training loss: 1.6682 0.4065 sec/batch
Epoch 20/20  Iteration 3547/3560 Training loss: 1.6683 0.4108 sec/batch
Epoch 20/20  Iteration 3548/3560 Training loss: 1.6682 0.4269 sec/batch
Epoch 20/20  Iteration 3549/3560 Training loss: 1.6684 0.4098 sec/batch
Epoch 20/20  Iteration 3550/3560 Training loss: 1.6689 0.4034 sec/batch
Epoch 20/20  Iteration 3551/3560 Training loss: 1.6689 0.4034 sec/batch
Epoch 20/20  Iteration 3552/3560 Training loss: 1.6689 0.4043 sec/batch
Epoch 20/20  Iteration 3553/3560 Training loss: 1.6688 0.4050 sec/batch
Epoch 20/20  Iteration 3554/3560 Training loss: 1.6686 0.4142 sec/batch
Epoch 20/20  Iteration 3555/3560 Training loss: 1.6688 0.4049 sec/batch
Epoch 20/20  Iteration 3556/3560 Training loss: 1.6687 0.4308 sec/batch
Epoch 20/20  Iteration 3557/3560 Training loss: 1.6688 0.3997 sec/batch
Epoch 20/20  Iteration 3558/3560 Training loss: 1.6688 0.4071 sec/batch
Epoch 20/20  Iteration 3559/3560 Training loss: 1.6687 0.4093 sec/batch
Epoch 20/20  Iteration 3560/3560 Training loss: 1.6688 0.4061 sec/batch
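
In each log line above, the number after "Training loss:" is the running average of the loss over the current epoch (which is why it jumps up at the start of every epoch and settles as the epoch goes on), and the final number is the wall-clock time per batch. With 3560 iterations over 20 epochs, that works out to 178 batches per epoch. As a rough, self-contained sketch of that logging pattern (not the notebook's actual training cell, and with a dummy loss standing in for the value the network returns), the bookkeeping looks something like this:

import time

epochs = 20
batches_per_epoch = 178                     # 3560 total iterations / 20 epochs
total_iterations = epochs * batches_per_epoch

iteration = 0
for e in range(epochs):
    loss_sum = 0.0
    for b in range(1, batches_per_epoch + 1):
        iteration += 1
        start = time.time()
        # Dummy stand-in for the batch loss the network would return here
        batch_loss = 4.42 / (1 + 0.05 * iteration)
        loss_sum += batch_loss
        end = time.time()
        # Loss column = running average over the current epoch;
        # last column = wall-clock seconds spent on this batch
        print('Epoch {}/{} '.format(e + 1, epochs),
              'Iteration {}/{}'.format(iteration, total_iterations),
              'Training loss: {:.4f}'.format(loss_sum / b),
              '{:.4f} sec/batch'.format(end - start))
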
Epoch 1/20  Iteration 1/3560 Training loss: 4.4182 0.4347 sec/batch
Epoch 1/20  Iteration 2/3560 Training loss: 4.4129 0.4071 sec/batch
Epoch 1/20  Iteration 3/3560 Training loss: 4.4078 0.4023 sec/batch
Epoch 1/20  Iteration 4/3560 Training loss: 4.4020 0.4067 sec/batch
Epoch 1/20  Iteration 5/3560 Training loss: 4.3955 0.4006 sec/batch
Epoch 1/20  Iteration 6/3560 Training loss: 4.3883 0.4221 sec/batch
Epoch 1/20  Iteration 7/3560 Training loss: 4.3787 0.3956 sec/batch
Epoch 1/20  Iteration 8/3560 Training loss: 4.3655 0.4035 sec/batch
Epoch 1/20  Iteration 9/3560 Training loss: 4.3447 0.4090 sec/batch
Epoch 1/20  Iteration 10/3560 Training loss: 4.3102 0.4058 sec/batch
Epoch 1/20  Iteration 11/3560 Training loss: 4.2645 0.4107 sec/batch
Epoch 1/20  Iteration 12/3560 Training loss: 4.2178 0.4106 sec/batch
Epoch 1/20  Iteration 13/3560 Training loss: 4.1725 0.4151 sec/batch
Epoch 1/20  Iteration 14/3560 Training loss: 4.1321 0.4068 sec/batch
Epoch 1/20  Iteration 15/3560 Training loss: 4.0937 0.4057 sec/batch
Epoch 1/20  Iteration 16/3560 Training loss: 4.0578 0.4053 sec/batch
Epoch 1/20  Iteration 17/3560 Training loss: 4.0239 0.4036 sec/batch
Epoch 1/20  Iteration 18/3560 Training loss: 3.9927 0.4118 sec/batch
Epoch 1/20  Iteration 19/3560 Training loss: 3.9630 0.4050 sec/batch
Epoch 1/20  Iteration 20/3560 Training loss: 3.9335 0.4082 sec/batch
Epoch 1/20  Iteration 21/3560 Training loss: 3.9068 0.4164 sec/batch
Epoch 1/20  Iteration 22/3560 Training loss: 3.8818 0.4110 sec/batch
Epoch 1/20  Iteration 23/3560 Training loss: 3.8582 0.4085 sec/batch
Epoch 1/20  Iteration 24/3560 Training loss: 3.8368 0.4067 sec/batch
Epoch 1/20  Iteration 25/3560 Training loss: 3.8154 0.4044 sec/batch
Epoch 1/20  Iteration 26/3560 Training loss: 3.7959 0.4079 sec/batch
Epoch 1/20  Iteration 27/3560 Training loss: 3.7780 0.4073 sec/batch
Epoch 1/20  Iteration 28/3560 Training loss: 3.7603 0.4016 sec/batch
Epoch 1/20  Iteration 29/3560 Training loss: 3.7439 0.4097 sec/batch
Epoch 1/20  Iteration 30/3560 Training loss: 3.7282 0.4103 sec/batch
Epoch 1/20  Iteration 31/3560 Training loss: 3.7140 0.4003 sec/batch
Epoch 1/20  Iteration 32/3560 Training loss: 3.6998 0.4010 sec/batch
Epoch 1/20  Iteration 33/3560 Training loss: 3.6860 0.4035 sec/batch
Epoch 1/20  Iteration 34/3560 Training loss: 3.6734 0.4044 sec/batch
Epoch 1/20  Iteration 35/3560 Training loss: 3.6606 0.4125 sec/batch
Epoch 1/20  Iteration 36/3560 Training loss: 3.6491 0.4220 sec/batch
Epoch 1/20  Iteration 37/3560 Training loss: 3.6377 0.4154 sec/batch
Epoch 1/20  Iteration 38/3560 Training loss: 3.6265 0.4038 sec/batch
Epoch 1/20  Iteration 39/3560 Training loss: 3.6158 0.4028 sec/batch
Epoch 1/20  Iteration 40/3560 Training loss: 3.6058 0.4091 sec/batch
Epoch 1/20  Iteration 41/3560 Training loss: 3.5959 0.4034 sec/batch
Epoch 1/20  Iteration 42/3560 Training loss: 3.5866 0.4023 sec/batch
Epoch 1/20  Iteration 43/3560 Training loss: 3.5774 0.4134 sec/batch
Epoch 1/20  Iteration 44/3560 Training loss: 3.5688 0.4076 sec/batch
Epoch 1/20  Iteration 45/3560 Training loss: 3.5603 0.4033 sec/batch
Epoch 1/20  Iteration 46/3560 Training loss: 3.5524 0.4061 sec/batch
Epoch 1/20  Iteration 47/3560 Training loss: 3.5449 0.4064 sec/batch
Epoch 1/20  Iteration 48/3560 Training loss: 3.5376 0.4035 sec/batch
Epoch 1/20  Iteration 49/3560 Training loss: 3.5308 0.4059 sec/batch
Epoch 1/20  Iteration 50/3560 Training loss: 3.5241 0.4162 sec/batch
Epoch 1/20  Iteration 51/3560 Training loss: 3.5173 0.4024 sec/batch
Epoch 1/20  Iteration 52/3560 Training loss: 3.5108 0.4090 sec/batch
Epoch 1/20  Iteration 53/3560 Training loss: 3.5047 0.4059 sec/batch
Epoch 1/20  Iteration 54/3560 Training loss: 3.4983 0.4077 sec/batch
Epoch 1/20  Iteration 55/3560 Training loss: 3.4926 0.4571 sec/batch
Epoch 1/20  Iteration 56/3560 Training loss: 3.4868 0.3998 sec/batch
Epoch 1/20  Iteration 57/3560 Training loss: 3.4811 0.4062 sec/batch
Epoch 1/20  Iteration 58/3560 Training loss: 3.4758 0.4211 sec/batch
Epoch 1/20  Iteration 59/3560 Training loss: 3.4704 0.4049 sec/batch
Epoch 1/20  Iteration 60/3560 Training loss: 3.4653 0.4006 sec/batch
Epoch 1/20  Iteration 61/3560 Training loss: 3.4603 0.4108 sec/batch
Epoch 1/20  Iteration 62/3560 Training loss: 3.4558 0.4062 sec/batch
Epoch 1/20  Iteration 63/3560 Training loss: 3.4515 0.4033 sec/batch
Epoch 1/20  Iteration 64/3560 Training loss: 3.4467 0.4159 sec/batch
Epoch 1/20  Iteration 65/3560 Training loss: 3.4421 0.4052 sec/batch
Epoch 1/20  Iteration 66/3560 Training loss: 3.4379 0.4055 sec/batch
Epoch 1/20  Iteration 67/3560 Training loss: 3.4339 0.4280 sec/batch
Epoch 1/20  Iteration 68/3560 Training loss: 3.4292 0.4033 sec/batch
Epoch 1/20  Iteration 69/3560 Training loss: 3.4249 0.4107 sec/batch
Epoch 1/20  Iteration 70/3560 Training loss: 3.4210 0.4057 sec/batch
Epoch 1/20  Iteration 71/3560 Training loss: 3.4171 0.4020 sec/batch
Epoch 1/20  Iteration 72/3560 Training loss: 3.4135 0.4042 sec/batch
Epoch 1/20  Iteration 73/3560 Training loss: 3.4098 0.4030 sec/batch
Epoch 1/20  Iteration 74/3560 Training loss: 3.4061 0.4030 sec/batch
Epoch 1/20  Iteration 75/3560 Training loss: 3.4027 0.4012 sec/batch
Epoch 1/20  Iteration 76/3560 Training loss: 3.3995 0.4007 sec/batch
Epoch 1/20  Iteration 77/3560 Training loss: 3.3962 0.4090 sec/batch
Epoch 1/20  Iteration 78/3560 Training loss: 3.3929 0.4072 sec/batch
Epoch 1/20  Iteration 79/3560 Training loss: 3.3896 0.4158 sec/batch
Epoch 1/20  Iteration 80/3560 Training loss: 3.3863 0.4252 sec/batch
Epoch 1/20  Iteration 81/3560 Training loss: 3.3830 0.4114 sec/batch
Epoch 1/20  Iteration 82/3560 Training loss: 3.3800 0.4068 sec/batch
Epoch 1/20  Iteration 83/3560 Training loss: 3.3770 0.4071 sec/batch
Epoch 1/20  Iteration 84/3560 Training loss: 3.3741 0.4074 sec/batch
Epoch 1/20  Iteration 85/3560 Training loss: 3.3710 0.4015 sec/batch
Epoch 1/20  Iteration 86/3560 Training loss: 3.3681 0.4071 sec/batch
Epoch 1/20  Iteration 87/3560 Training loss: 3.3652 0.4027 sec/batch
Epoch 1/20  Iteration 88/3560 Training loss: 3.3624 0.4016 sec/batch
Epoch 1/20  Iteration 89/3560 Training loss: 3.3598 0.4023 sec/batch
Epoch 1/20  Iteration 90/3560 Training loss: 3.3573 0.4099 sec/batch
Epoch 1/20  Iteration 91/3560 Training loss: 3.3548 0.4093 sec/batch
Epoch 1/20  Iteration 92/3560 Training loss: 3.3521 0.4198 sec/batch
Epoch 1/20  Iteration 93/3560 Training loss: 3.3497 0.4036 sec/batch
Epoch 1/20  Iteration 94/3560 Training loss: 3.3473 0.3983 sec/batch
Epoch 1/20  Iteration 95/3560 Training loss: 3.3448 0.4153 sec/batch
Epoch 1/20  Iteration 96/3560 Training loss: 3.3423 0.4053 sec/batch
Epoch 1/20  Iteration 97/3560 Training loss: 3.3400 0.4046 sec/batch
Epoch 1/20  Iteration 98/3560 Training loss: 3.3376 0.4063 sec/batch
Epoch 1/20  Iteration 99/3560 Training loss: 3.3354 0.4033 sec/batch
Epoch 1/20  Iteration 100/3560 Training loss: 3.3331 0.4023 sec/batch
Epoch 1/20  Iteration 101/3560 Training loss: 3.3309 0.4075 sec/batch
Epoch 1/20  Iteration 102/3560 Training loss: 3.3287 0.4257 sec/batch
Epoch 1/20  Iteration 103/3560 Training loss: 3.3265 0.4146 sec/batch
Epoch 1/20  Iteration 104/3560 Training loss: 3.3243 0.4198 sec/batch
Epoch 1/20  Iteration 105/3560 Training loss: 3.3222 0.4062 sec/batch
Epoch 1/20  Iteration 106/3560 Training loss: 3.3202 0.4018 sec/batch
Epoch 1/20  Iteration 107/3560 Training loss: 3.3179 0.4076 sec/batch
Epoch 1/20  Iteration 108/3560 Training loss: 3.3157 0.4125 sec/batch
Epoch 1/20  Iteration 109/3560 Training loss: 3.3137 0.4052 sec/batch
Epoch 1/20  Iteration 110/3560 Training loss: 3.3114 0.4030 sec/batch
Epoch 1/20  Iteration 111/3560 Training loss: 3.3094 0.4063 sec/batch
Epoch 1/20  Iteration 112/3560 Training loss: 3.3075 0.4056 sec/batch
Epoch 1/20  Iteration 113/3560 Training loss: 3.3055 0.4043 sec/batch
Epoch 1/20  Iteration 114/3560 Training loss: 3.3034 0.4024 sec/batch
Epoch 1/20  Iteration 115/3560 Training loss: 3.3013 0.4028 sec/batch
Epoch 1/20  Iteration 116/3560 Training loss: 3.2993 0.4021 sec/batch
Epoch 1/20  Iteration 117/3560 Training loss: 3.2973 0.4223 sec/batch
Epoch 1/20  Iteration 118/3560 Training loss: 3.2956 0.4036 sec/batch
Epoch 1/20  Iteration 119/3560 Training loss: 3.2938 0.4009 sec/batch
Epoch 1/20  Iteration 120/3560 Training loss: 3.2919 0.4111 sec/batch
Epoch 1/20  Iteration 121/3560 Training loss: 3.2902 0.4054 sec/batch
Epoch 1/20  Iteration 122/3560 Training loss: 3.2885 0.4171 sec/batch
Epoch 1/20  Iteration 123/3560 Training loss: 3.2867 0.4100 sec/batch
Epoch 1/20  Iteration 124/3560 Training loss: 3.2851 0.4068 sec/batch
Epoch 1/20  Iteration 125/3560 Training loss: 3.2832 0.4029 sec/batch
Epoch 1/20  Iteration 126/3560 Training loss: 3.2813 0.4128 sec/batch
Epoch 1/20  Iteration 127/3560 Training loss: 3.2795 0.4062 sec/batch
Epoch 1/20  Iteration 128/3560 Training loss: 3.2779 0.4001 sec/batch
Epoch 1/20  Iteration 129/3560 Training loss: 3.2761 0.4260 sec/batch
Epoch 1/20  Iteration 130/3560 Training loss: 3.2744 0.4095 sec/batch
Epoch 1/20  Iteration 131/3560 Training loss: 3.2729 0.4176 sec/batch
Epoch 1/20  Iteration 132/3560 Training loss: 3.2711 0.4123 sec/batch
Epoch 1/20  Iteration 133/3560 Training loss: 3.2695 0.4059 sec/batch
Epoch 1/20  Iteration 134/3560 Training loss: 3.2677 0.4052 sec/batch
Epoch 1/20  Iteration 135/3560 Training loss: 3.2658 0.4163 sec/batch
Epoch 1/20  Iteration 136/3560 Training loss: 3.2639 0.4206 sec/batch
Epoch 1/20  Iteration 137/3560 Training loss: 3.2622 0.4100 sec/batch
Epoch 1/20  Iteration 138/3560 Training loss: 3.2604 0.4012 sec/batch
Epoch 1/20  Iteration 139/3560 Training loss: 3.2588 0.4053 sec/batch
Epoch 1/20  Iteration 140/3560 Training loss: 3.2571 0.4040 sec/batch
Epoch 1/20  Iteration 141/3560 Training loss: 3.2555 0.4260 sec/batch
Epoch 1/20  Iteration 142/3560 Training loss: 3.2537 0.4096 sec/batch
Epoch 1/20  Iteration 143/3560 Training loss: 3.2520 0.4058 sec/batch
Epoch 1/20  Iteration 144/3560 Training loss: 3.2503 0.4104 sec/batch
Epoch 1/20  Iteration 145/3560 Training loss: 3.2486 0.4014 sec/batch
Epoch 1/20  Iteration 146/3560 Training loss: 3.2470 0.4011 sec/batch
Epoch 1/20  Iteration 147/3560 Training loss: 3.2454 0.4009 sec/batch
Epoch 1/20  Iteration 148/3560 Training loss: 3.2439 0.4109 sec/batch
Epoch 1/20  Iteration 149/3560 Training loss: 3.2422 0.4034 sec/batch
Epoch 1/20  Iteration 150/3560 Training loss: 3.2405 0.4065 sec/batch
Epoch 1/20  Iteration 151/3560 Training loss: 3.2390 0.4115 sec/batch
Epoch 1/20  Iteration 152/3560 Training loss: 3.2376 0.4215 sec/batch
Epoch 1/20  Iteration 153/3560 Training loss: 3.2360 0.4048 sec/batch
Epoch 1/20  Iteration 154/3560 Training loss: 3.2344 0.4278 sec/batch
Epoch 1/20  Iteration 155/3560 Training loss: 3.2327 0.4028 sec/batch
Epoch 1/20  Iteration 156/3560 Training loss: 3.2310 0.4130 sec/batch
Epoch 1/20  Iteration 157/3560 Training loss: 3.2292 0.4043 sec/batch
Epoch 1/20  Iteration 158/3560 Training loss: 3.2276 0.4037 sec/batch
Epoch 1/20  Iteration 159/3560 Training loss: 3.2258 0.4038 sec/batch
Epoch 1/20  Iteration 160/3560 Training loss: 3.2240 0.4069 sec/batch
Epoch 1/20  Iteration 161/3560 Training loss: 3.2224 0.4089 sec/batch
Epoch 1/20  Iteration 162/3560 Training loss: 3.2205 0.4095 sec/batch
Epoch 1/20  Iteration 163/3560 Training loss: 3.2188 0.4020 sec/batch
Epoch 1/20  Iteration 164/3560 Training loss: 3.2171 0.4195 sec/batch
Epoch 1/20  Iteration 165/3560 Training loss: 3.2154 0.4057 sec/batch
Epoch 1/20  Iteration 166/3560 Training loss: 3.2137 0.4650 sec/batch
Epoch 1/20  Iteration 167/3560 Training loss: 3.2120 0.4061 sec/batch
Epoch 1/20  Iteration 168/3560 Training loss: 3.2103 0.4049 sec/batch
Epoch 1/20  Iteration 169/3560 Training loss: 3.2086 0.4233 sec/batch
Epoch 1/20  Iteration 170/3560 Training loss: 3.2068 0.4072 sec/batch
Epoch 1/20  Iteration 171/3560 Training loss: 3.2051 0.3988 sec/batch
Epoch 1/20  Iteration 172/3560 Training loss: 3.2036 0.4093 sec/batch
Epoch 1/20  Iteration 173/3560 Training loss: 3.2021 0.4068 sec/batch
Epoch 1/20  Iteration 174/3560 Training loss: 3.2007 0.4051 sec/batch
Epoch 1/20  Iteration 175/3560 Training loss: 3.1991 0.4102 sec/batch
Epoch 1/20  Iteration 176/3560 Training loss: 3.1974 0.4073 sec/batch
Epoch 1/20  Iteration 177/3560 Training loss: 3.1956 0.4180 sec/batch
Epoch 1/20  Iteration 178/3560 Training loss: 3.1938 0.4247 sec/batch
Epoch 2/20  Iteration 179/3560 Training loss: 2.9509 0.4191 sec/batch
Epoch 2/20  Iteration 180/3560 Training loss: 2.9047 0.4026 sec/batch
Epoch 2/20  Iteration 181/3560 Training loss: 2.8866 0.4035 sec/batch
Epoch 2/20  Iteration 182/3560 Training loss: 2.8796 0.4139 sec/batch
Epoch 2/20  Iteration 183/3560 Training loss: 2.8763 0.4075 sec/batch
Epoch 2/20  Iteration 184/3560 Training loss: 2.8738 0.4027 sec/batch
Epoch 2/20  Iteration 185/3560 Training loss: 2.8724 0.4124 sec/batch
Epoch 2/20  Iteration 186/3560 Training loss: 2.8708 0.4086 sec/batch
Epoch 2/20  Iteration 187/3560 Training loss: 2.8683 0.3986 sec/batch
Epoch 2/20  Iteration 188/3560 Training loss: 2.8660 0.4092 sec/batch
Epoch 2/20  Iteration 189/3560 Training loss: 2.8620 0.4081 sec/batch
Epoch 2/20  Iteration 190/3560 Training loss: 2.8597 0.4239 sec/batch
Epoch 2/20  Iteration 191/3560 Training loss: 2.8576 0.4209 sec/batch
Epoch 2/20  Iteration 192/3560 Training loss: 2.8563 0.4169 sec/batch
Epoch 2/20  Iteration 193/3560 Training loss: 2.8549 0.4099 sec/batch
Epoch 2/20  Iteration 194/3560 Training loss: 2.8535 0.4045 sec/batch
Epoch 2/20  Iteration 195/3560 Training loss: 2.8513 0.4089 sec/batch
Epoch 2/20  Iteration 196/3560 Training loss: 2.8510 0.4083 sec/batch
Epoch 2/20  Iteration 197/3560 Training loss: 2.8491 0.4060 sec/batch
Epoch 2/20  Iteration 198/3560 Training loss: 2.8458 0.4037 sec/batch
Epoch 2/20  Iteration 199/3560 Training loss: 2.8433 0.4073 sec/batch
Epoch 2/20  Iteration 200/3560 Training loss: 2.8418 0.4237 sec/batch
Epoch 2/20  Iteration 201/3560 Training loss: 2.8395 0.4083 sec/batch
Epoch 2/20  Iteration 202/3560 Training loss: 2.8375 0.4147 sec/batch
Epoch 2/20  Iteration 203/3560 Training loss: 2.8350 0.4365 sec/batch
Epoch 2/20  Iteration 204/3560 Training loss: 2.8334 0.4074 sec/batch
Epoch 2/20  Iteration 205/3560 Training loss: 2.8317 0.4134 sec/batch
Epoch 2/20  Iteration 206/3560 Training loss: 2.8295 0.4057 sec/batch
Epoch 2/20  Iteration 207/3560 Training loss: 2.8277 0.4070 sec/batch
Epoch 2/20  Iteration 208/3560 Training loss: 2.8259 0.4052 sec/batch
Epoch 2/20  Iteration 209/3560 Training loss: 2.8248 0.4099 sec/batch
Epoch 2/20  Iteration 210/3560 Training loss: 2.8227 0.4055 sec/batch
Epoch 2/20  Iteration 211/3560 Training loss: 2.8202 0.4020 sec/batch
Epoch 2/20  Iteration 212/3560 Training loss: 2.8183 0.4027 sec/batch
Epoch 2/20  Iteration 213/3560 Training loss: 2.8158 0.4048 sec/batch
Epoch 2/20  Iteration 214/3560 Training loss: 2.8143 0.4027 sec/batch
Epoch 2/20  Iteration 215/3560 Training loss: 2.8122 0.4204 sec/batch
Epoch 2/20  Iteration 216/3560 Training loss: 2.8097 0.4063 sec/batch
Epoch 2/20  Iteration 217/3560 Training loss: 2.8074 0.4089 sec/batch
Epoch 2/20  Iteration 218/3560 Training loss: 2.8053 0.4058 sec/batch
Epoch 2/20  Iteration 219/3560 Training loss: 2.8029 0.4169 sec/batch
Epoch 2/20  Iteration 220/3560 Training loss: 2.8008 0.4033 sec/batch
Epoch 2/20  Iteration 221/3560 Training loss: 2.7986 0.4085 sec/batch
Epoch 2/20  Iteration 222/3560 Training loss: 2.7965 0.4076 sec/batch
Epoch 2/20  Iteration 223/3560 Training loss: 2.7942 0.4068 sec/batch
Epoch 2/20  Iteration 224/3560 Training loss: 2.7922 0.4108 sec/batch
Epoch 2/20  Iteration 225/3560 Training loss: 2.7904 0.4072 sec/batch
Epoch 2/20  Iteration 226/3560 Training loss: 2.7888 0.4080 sec/batch
Epoch 2/20  Iteration 227/3560 Training loss: 2.7870 0.4194 sec/batch
Epoch 2/20  Iteration 228/3560 Training loss: 2.7856 0.4135 sec/batch
Epoch 2/20  Iteration 229/3560 Training loss: 2.7837 0.4084 sec/batch
Epoch 2/20  Iteration 230/3560 Training loss: 2.7820 0.4038 sec/batch
...
Epoch 2/20  Iteration 356/3560 Training loss: 2.6286 0.4054 sec/batch
Epoch 3/20  Iteration 357/3560 Training loss: 2.5433 0.4066 sec/batch
...
Epoch 3/20  Iteration 534/3560 Training loss: 2.4050 0.4001 sec/batch
Epoch 4/20  Iteration 535/3560 Training loss: 2.4263 0.4206 sec/batch
...
Epoch 4/20  Iteration 712/3560 Training loss: 2.3100 0.3770 sec/batch
Epoch 5/20  Iteration 713/3560 Training loss: 2.3517 0.3790 sec/batch
...
Epoch 5/20  Iteration 890/3560 Training loss: 2.2443 0.3802 sec/batch
Epoch 6/20  Iteration 891/3560 Training loss: 2.2882 0.3705 sec/batch
...
Epoch 6/20  Iteration 1068/3560 Training loss: 2.1901 0.3811 sec/batch
Epoch 7/20  Iteration 1069/3560 Training loss: 2.2338 0.3789 sec/batch
...
Epoch 7/20  Iteration 1143/3560 Training loss: 2.1563 0.3731 sec/batch
Epoch 7/20  Iteration 1144/3560 Training loss: 2.1567 0.3736 sec/batch
Epoch 7/20  Iteration 1145/3560 Training loss: 2.1566 0.3958 sec/batch
Epoch 7/20  Iteration 1146/3560 Training loss: 2.1566 0.3755 sec/batch
Epoch 7/20  Iteration 1147/3560 Training loss: 2.1562 0.3779 sec/batch
Epoch 7/20  Iteration 1148/3560 Training loss: 2.1559 0.3782 sec/batch
Epoch 7/20  Iteration 1149/3560 Training loss: 2.1554 0.3789 sec/batch
Epoch 7/20  Iteration 1150/3560 Training loss: 2.1555 0.3873 sec/batch
Epoch 7/20  Iteration 1151/3560 Training loss: 2.1551 0.3800 sec/batch
Epoch 7/20  Iteration 1152/3560 Training loss: 2.1548 0.3827 sec/batch
Epoch 7/20  Iteration 1153/3560 Training loss: 2.1541 0.3735 sec/batch
Epoch 7/20  Iteration 1154/3560 Training loss: 2.1539 0.3770 sec/batch
Epoch 7/20  Iteration 1155/3560 Training loss: 2.1537 0.3835 sec/batch
Epoch 7/20  Iteration 1156/3560 Training loss: 2.1533 0.3767 sec/batch
Epoch 7/20  Iteration 1157/3560 Training loss: 2.1529 0.3769 sec/batch
Epoch 7/20  Iteration 1158/3560 Training loss: 2.1529 0.3945 sec/batch
Epoch 7/20  Iteration 1159/3560 Training loss: 2.1527 0.3782 sec/batch
Epoch 7/20  Iteration 1160/3560 Training loss: 2.1525 0.3740 sec/batch
Epoch 7/20  Iteration 1161/3560 Training loss: 2.1520 0.3755 sec/batch
Epoch 7/20  Iteration 1162/3560 Training loss: 2.1516 0.3817 sec/batch
Epoch 7/20  Iteration 1163/3560 Training loss: 2.1514 0.3797 sec/batch
Epoch 7/20  Iteration 1164/3560 Training loss: 2.1511 0.3794 sec/batch
Epoch 7/20  Iteration 1165/3560 Training loss: 2.1510 0.3880 sec/batch
Epoch 7/20  Iteration 1166/3560 Training loss: 2.1507 0.3707 sec/batch
Epoch 7/20  Iteration 1167/3560 Training loss: 2.1504 0.3871 sec/batch
Epoch 7/20  Iteration 1168/3560 Training loss: 2.1500 0.3745 sec/batch
Epoch 7/20  Iteration 1169/3560 Training loss: 2.1500 0.3814 sec/batch
Epoch 7/20  Iteration 1170/3560 Training loss: 2.1499 0.3723 sec/batch
Epoch 7/20  Iteration 1171/3560 Training loss: 2.1496 0.3860 sec/batch
Epoch 7/20  Iteration 1172/3560 Training loss: 2.1493 0.3873 sec/batch
Epoch 7/20  Iteration 1173/3560 Training loss: 2.1490 0.3796 sec/batch
Epoch 7/20  Iteration 1174/3560 Training loss: 2.1488 0.3822 sec/batch
Epoch 7/20  Iteration 1175/3560 Training loss: 2.1486 0.3739 sec/batch
Epoch 7/20  Iteration 1176/3560 Training loss: 2.1486 0.3742 sec/batch
Epoch 7/20  Iteration 1177/3560 Training loss: 2.1485 0.3757 sec/batch
Epoch 7/20  Iteration 1178/3560 Training loss: 2.1483 0.3745 sec/batch
Epoch 7/20  Iteration 1179/3560 Training loss: 2.1483 0.3887 sec/batch
Epoch 7/20  Iteration 1180/3560 Training loss: 2.1482 0.3722 sec/batch
Epoch 7/20  Iteration 1181/3560 Training loss: 2.1480 0.3870 sec/batch
Epoch 7/20  Iteration 1182/3560 Training loss: 2.1478 0.3798 sec/batch
Epoch 7/20  Iteration 1183/3560 Training loss: 2.1476 0.3761 sec/batch
Epoch 7/20  Iteration 1184/3560 Training loss: 2.1471 0.3776 sec/batch
Epoch 7/20  Iteration 1185/3560 Training loss: 2.1471 0.4049 sec/batch
Epoch 7/20  Iteration 1186/3560 Training loss: 2.1469 0.3799 sec/batch
Epoch 7/20  Iteration 1187/3560 Training loss: 2.1469 0.3763 sec/batch
Epoch 7/20  Iteration 1188/3560 Training loss: 2.1469 0.3884 sec/batch
Epoch 7/20  Iteration 1189/3560 Training loss: 2.1469 0.3795 sec/batch
Epoch 7/20  Iteration 1190/3560 Training loss: 2.1466 0.3800 sec/batch
Epoch 7/20  Iteration 1191/3560 Training loss: 2.1463 0.3806 sec/batch
Epoch 7/20  Iteration 1192/3560 Training loss: 2.1464 0.3783 sec/batch
Epoch 7/20  Iteration 1193/3560 Training loss: 2.1463 0.3839 sec/batch
Epoch 7/20  Iteration 1194/3560 Training loss: 2.1459 0.3756 sec/batch
Epoch 7/20  Iteration 1195/3560 Training loss: 2.1459 0.3775 sec/batch
Epoch 7/20  Iteration 1196/3560 Training loss: 2.1458 0.3830 sec/batch
Epoch 7/20  Iteration 1197/3560 Training loss: 2.1458 0.3762 sec/batch
Epoch 7/20  Iteration 1198/3560 Training loss: 2.1458 0.4050 sec/batch
Epoch 7/20  Iteration 1199/3560 Training loss: 2.1456 0.3793 sec/batch
Epoch 7/20  Iteration 1200/3560 Training loss: 2.1453 0.3840 sec/batch
Epoch 7/20  Iteration 1201/3560 Training loss: 2.1452 0.3735 sec/batch
Epoch 7/20  Iteration 1202/3560 Training loss: 2.1453 0.3811 sec/batch
Epoch 7/20  Iteration 1203/3560 Training loss: 2.1452 0.3771 sec/batch
Epoch 7/20  Iteration 1204/3560 Training loss: 2.1451 0.3778 sec/batch
Epoch 7/20  Iteration 1205/3560 Training loss: 2.1450 0.3766 sec/batch
Epoch 7/20  Iteration 1206/3560 Training loss: 2.1450 0.3785 sec/batch
Epoch 7/20  Iteration 1207/3560 Training loss: 2.1452 0.3810 sec/batch
Epoch 7/20  Iteration 1208/3560 Training loss: 2.1450 0.3812 sec/batch
Epoch 7/20  Iteration 1209/3560 Training loss: 2.1451 0.3804 sec/batch
Epoch 7/20  Iteration 1210/3560 Training loss: 2.1449 0.3867 sec/batch
Epoch 7/20  Iteration 1211/3560 Training loss: 2.1448 0.4041 sec/batch
Epoch 7/20  Iteration 1212/3560 Training loss: 2.1447 0.3822 sec/batch
Epoch 7/20  Iteration 1213/3560 Training loss: 2.1445 0.3716 sec/batch
Epoch 7/20  Iteration 1214/3560 Training loss: 2.1446 0.3781 sec/batch
Epoch 7/20  Iteration 1215/3560 Training loss: 2.1446 0.3827 sec/batch
Epoch 7/20  Iteration 1216/3560 Training loss: 2.1446 0.3785 sec/batch
Epoch 7/20  Iteration 1217/3560 Training loss: 2.1445 0.3773 sec/batch
Epoch 7/20  Iteration 1218/3560 Training loss: 2.1443 0.3770 sec/batch
Epoch 7/20  Iteration 1219/3560 Training loss: 2.1441 0.3763 sec/batch
Epoch 7/20  Iteration 1220/3560 Training loss: 2.1442 0.3774 sec/batch
Epoch 7/20  Iteration 1221/3560 Training loss: 2.1442 0.3760 sec/batch
Epoch 7/20  Iteration 1222/3560 Training loss: 2.1442 0.3765 sec/batch
Epoch 7/20  Iteration 1223/3560 Training loss: 2.1441 0.3767 sec/batch
Epoch 7/20  Iteration 1224/3560 Training loss: 2.1439 0.3910 sec/batch
Epoch 7/20  Iteration 1225/3560 Training loss: 2.1439 0.3910 sec/batch
Epoch 7/20  Iteration 1226/3560 Training loss: 2.1437 0.3773 sec/batch
Epoch 7/20  Iteration 1227/3560 Training loss: 2.1435 0.3788 sec/batch
Epoch 7/20  Iteration 1228/3560 Training loss: 2.1435 0.3796 sec/batch
Epoch 7/20  Iteration 1229/3560 Training loss: 2.1435 0.3804 sec/batch
Epoch 7/20  Iteration 1230/3560 Training loss: 2.1433 0.3757 sec/batch
Epoch 7/20  Iteration 1231/3560 Training loss: 2.1432 0.3749 sec/batch
Epoch 7/20  Iteration 1232/3560 Training loss: 2.1431 0.3764 sec/batch
Epoch 7/20  Iteration 1233/3560 Training loss: 2.1431 0.3794 sec/batch
Epoch 7/20  Iteration 1234/3560 Training loss: 2.1429 0.3769 sec/batch
Epoch 7/20  Iteration 1235/3560 Training loss: 2.1429 0.3817 sec/batch
Epoch 7/20  Iteration 1236/3560 Training loss: 2.1430 0.3777 sec/batch
Epoch 7/20  Iteration 1237/3560 Training loss: 2.1429 0.3851 sec/batch
Epoch 7/20  Iteration 1238/3560 Training loss: 2.1428 0.3971 sec/batch
Epoch 7/20  Iteration 1239/3560 Training loss: 2.1426 0.3767 sec/batch
Epoch 7/20  Iteration 1240/3560 Training loss: 2.1425 0.3855 sec/batch
Epoch 7/20  Iteration 1241/3560 Training loss: 2.1425 0.3756 sec/batch
Epoch 7/20  Iteration 1242/3560 Training loss: 2.1425 0.3841 sec/batch
Epoch 7/20  Iteration 1243/3560 Training loss: 2.1424 0.3721 sec/batch
Epoch 7/20  Iteration 1244/3560 Training loss: 2.1424 0.3739 sec/batch
Epoch 7/20  Iteration 1245/3560 Training loss: 2.1423 0.3820 sec/batch
Epoch 7/20  Iteration 1246/3560 Training loss: 2.1422 0.3818 sec/batch
Epoch 8/20  Iteration 1247/3560 Training loss: 2.1878 0.3761 sec/batch
Epoch 8/20  Iteration 1248/3560 Training loss: 2.1455 0.3751 sec/batch
Epoch 8/20  Iteration 1249/3560 Training loss: 2.1350 0.3724 sec/batch
Epoch 8/20  Iteration 1250/3560 Training loss: 2.1275 0.3757 sec/batch
Epoch 8/20  Iteration 1251/3560 Training loss: 2.1227 0.4084 sec/batch
Epoch 8/20  Iteration 1252/3560 Training loss: 2.1181 0.3813 sec/batch
Epoch 8/20  Iteration 1253/3560 Training loss: 2.1193 0.3818 sec/batch
Epoch 8/20  Iteration 1254/3560 Training loss: 2.1196 0.3753 sec/batch
Epoch 8/20  Iteration 1255/3560 Training loss: 2.1222 0.3784 sec/batch
Epoch 8/20  Iteration 1256/3560 Training loss: 2.1216 0.3770 sec/batch
Epoch 8/20  Iteration 1257/3560 Training loss: 2.1192 0.4085 sec/batch
Epoch 8/20  Iteration 1258/3560 Training loss: 2.1179 0.3777 sec/batch
Epoch 8/20  Iteration 1259/3560 Training loss: 2.1184 0.3809 sec/batch
Epoch 8/20  Iteration 1260/3560 Training loss: 2.1206 0.3851 sec/batch
Epoch 8/20  Iteration 1261/3560 Training loss: 2.1199 0.3744 sec/batch
Epoch 8/20  Iteration 1262/3560 Training loss: 2.1194 0.3830 sec/batch
Epoch 8/20  Iteration 1263/3560 Training loss: 2.1192 0.3694 sec/batch
Epoch 8/20  Iteration 1264/3560 Training loss: 2.1210 0.4010 sec/batch
Epoch 8/20  Iteration 1265/3560 Training loss: 2.1215 0.3741 sec/batch
Epoch 8/20  Iteration 1266/3560 Training loss: 2.1215 0.3779 sec/batch
Epoch 8/20  Iteration 1267/3560 Training loss: 2.1209 0.3794 sec/batch
Epoch 8/20  Iteration 1268/3560 Training loss: 2.1218 0.3806 sec/batch
Epoch 8/20  Iteration 1269/3560 Training loss: 2.1219 0.3784 sec/batch
Epoch 8/20  Iteration 1270/3560 Training loss: 2.1212 0.3777 sec/batch
Epoch 8/20  Iteration 1271/3560 Training loss: 2.1209 0.3847 sec/batch
Epoch 8/20  Iteration 1272/3560 Training loss: 2.1199 0.3733 sec/batch
Epoch 8/20  Iteration 1273/3560 Training loss: 2.1191 0.3735 sec/batch
Epoch 8/20  Iteration 1274/3560 Training loss: 2.1194 0.3837 sec/batch
Epoch 8/20  Iteration 1275/3560 Training loss: 2.1201 0.3774 sec/batch
Epoch 8/20  Iteration 1276/3560 Training loss: 2.1203 0.3786 sec/batch
Epoch 8/20  Iteration 1277/3560 Training loss: 2.1203 0.3902 sec/batch
Epoch 8/20  Iteration 1278/3560 Training loss: 2.1197 0.3950 sec/batch
Epoch 8/20  Iteration 1279/3560 Training loss: 2.1196 0.3800 sec/batch
Epoch 8/20  Iteration 1280/3560 Training loss: 2.1204 0.3845 sec/batch
Epoch 8/20  Iteration 1281/3560 Training loss: 2.1199 0.3713 sec/batch
Epoch 8/20  Iteration 1282/3560 Training loss: 2.1197 0.3791 sec/batch
Epoch 8/20  Iteration 1283/3560 Training loss: 2.1194 0.3755 sec/batch
Epoch 8/20  Iteration 1284/3560 Training loss: 2.1182 0.3819 sec/batch
Epoch 8/20  Iteration 1285/3560 Training loss: 2.1172 0.3767 sec/batch
Epoch 8/20  Iteration 1286/3560 Training loss: 2.1165 0.3845 sec/batch
Epoch 8/20  Iteration 1287/3560 Training loss: 2.1160 0.3735 sec/batch
Epoch 8/20  Iteration 1288/3560 Training loss: 2.1159 0.3823 sec/batch
Epoch 8/20  Iteration 1289/3560 Training loss: 2.1153 0.3836 sec/batch
Epoch 8/20  Iteration 1290/3560 Training loss: 2.1146 0.3779 sec/batch
Epoch 8/20  Iteration 1291/3560 Training loss: 2.1145 0.3960 sec/batch
Epoch 8/20  Iteration 1292/3560 Training loss: 2.1132 0.3807 sec/batch
Epoch 8/20  Iteration 1293/3560 Training loss: 2.1131 0.3840 sec/batch
Epoch 8/20  Iteration 1294/3560 Training loss: 2.1127 0.3715 sec/batch
Epoch 8/20  Iteration 1295/3560 Training loss: 2.1126 0.3853 sec/batch
Epoch 8/20  Iteration 1296/3560 Training loss: 2.1132 0.3831 sec/batch
Epoch 8/20  Iteration 1297/3560 Training loss: 2.1126 0.3817 sec/batch
Epoch 8/20  Iteration 1298/3560 Training loss: 2.1130 0.3825 sec/batch
Epoch 8/20  Iteration 1299/3560 Training loss: 2.1126 0.3794 sec/batch
Epoch 8/20  Iteration 1300/3560 Training loss: 2.1123 0.3729 sec/batch
Epoch 8/20  Iteration 1301/3560 Training loss: 2.1119 0.3995 sec/batch
Epoch 8/20  Iteration 1302/3560 Training loss: 2.1120 0.3779 sec/batch
Epoch 8/20  Iteration 1303/3560 Training loss: 2.1122 0.3805 sec/batch
Epoch 8/20  Iteration 1304/3560 Training loss: 2.1118 0.4026 sec/batch
Epoch 8/20  Iteration 1305/3560 Training loss: 2.1115 0.3780 sec/batch
Epoch 8/20  Iteration 1306/3560 Training loss: 2.1119 0.3785 sec/batch
Epoch 8/20  Iteration 1307/3560 Training loss: 2.1116 0.3812 sec/batch
Epoch 8/20  Iteration 1308/3560 Training loss: 2.1120 0.3769 sec/batch
Epoch 8/20  Iteration 1309/3560 Training loss: 2.1123 0.3814 sec/batch
Epoch 8/20  Iteration 1310/3560 Training loss: 2.1124 0.3869 sec/batch
Epoch 8/20  Iteration 1311/3560 Training loss: 2.1123 0.3775 sec/batch
Epoch 8/20  Iteration 1312/3560 Training loss: 2.1124 0.3747 sec/batch
Epoch 8/20  Iteration 1313/3560 Training loss: 2.1125 0.3785 sec/batch
Epoch 8/20  Iteration 1314/3560 Training loss: 2.1119 0.3740 sec/batch
Epoch 8/20  Iteration 1315/3560 Training loss: 2.1117 0.3868 sec/batch
Epoch 8/20  Iteration 1316/3560 Training loss: 2.1117 0.3741 sec/batch
Epoch 8/20  Iteration 1317/3560 Training loss: 2.1118 0.3965 sec/batch
Epoch 8/20  Iteration 1318/3560 Training loss: 2.1118 0.3792 sec/batch
Epoch 8/20  Iteration 1319/3560 Training loss: 2.1119 0.3831 sec/batch
Epoch 8/20  Iteration 1320/3560 Training loss: 2.1114 0.3803 sec/batch
Epoch 8/20  Iteration 1321/3560 Training loss: 2.1113 0.3776 sec/batch
Epoch 8/20  Iteration 1322/3560 Training loss: 2.1115 0.3796 sec/batch
Epoch 8/20  Iteration 1323/3560 Training loss: 2.1114 0.3785 sec/batch
Epoch 8/20  Iteration 1324/3560 Training loss: 2.1115 0.3758 sec/batch
Epoch 8/20  Iteration 1325/3560 Training loss: 2.1111 0.3753 sec/batch
Epoch 8/20  Iteration 1326/3560 Training loss: 2.1108 0.3776 sec/batch
Epoch 8/20  Iteration 1327/3560 Training loss: 2.1104 0.3762 sec/batch
Epoch 8/20  Iteration 1328/3560 Training loss: 2.1104 0.3773 sec/batch
Epoch 8/20  Iteration 1329/3560 Training loss: 2.1100 0.3772 sec/batch
Epoch 8/20  Iteration 1330/3560 Training loss: 2.1097 0.4090 sec/batch
Epoch 8/20  Iteration 1331/3560 Training loss: 2.1090 0.3808 sec/batch
Epoch 8/20  Iteration 1332/3560 Training loss: 2.1086 0.3778 sec/batch
Epoch 8/20  Iteration 1333/3560 Training loss: 2.1085 0.3782 sec/batch
Epoch 8/20  Iteration 1334/3560 Training loss: 2.1082 0.3967 sec/batch
Epoch 8/20  Iteration 1335/3560 Training loss: 2.1077 0.3780 sec/batch
Epoch 8/20  Iteration 1336/3560 Training loss: 2.1077 0.3780 sec/batch
Epoch 8/20  Iteration 1337/3560 Training loss: 2.1075 0.3819 sec/batch
Epoch 8/20  Iteration 1338/3560 Training loss: 2.1073 0.3795 sec/batch
Epoch 8/20  Iteration 1339/3560 Training loss: 2.1069 0.3802 sec/batch
Epoch 8/20  Iteration 1340/3560 Training loss: 2.1065 0.3754 sec/batch
Epoch 8/20  Iteration 1341/3560 Training loss: 2.1062 0.3773 sec/batch
Epoch 8/20  Iteration 1342/3560 Training loss: 2.1059 0.3755 sec/batch
Epoch 8/20  Iteration 1343/3560 Training loss: 2.1058 0.3777 sec/batch
Epoch 8/20  Iteration 1344/3560 Training loss: 2.1056 0.4077 sec/batch
Epoch 8/20  Iteration 1345/3560 Training loss: 2.1053 0.3758 sec/batch
Epoch 8/20  Iteration 1346/3560 Training loss: 2.1049 0.3821 sec/batch
Epoch 8/20  Iteration 1347/3560 Training loss: 2.1049 0.3752 sec/batch
Epoch 8/20  Iteration 1348/3560 Training loss: 2.1048 0.3751 sec/batch
Epoch 8/20  Iteration 1349/3560 Training loss: 2.1045 0.3786 sec/batch
Epoch 8/20  Iteration 1350/3560 Training loss: 2.1043 0.3826 sec/batch
Epoch 8/20  Iteration 1351/3560 Training loss: 2.1041 0.3746 sec/batch
Epoch 8/20  Iteration 1352/3560 Training loss: 2.1039 0.3867 sec/batch
Epoch 8/20  Iteration 1353/3560 Training loss: 2.1038 0.3802 sec/batch
Epoch 8/20  Iteration 1354/3560 Training loss: 2.1037 0.3898 sec/batch
Epoch 8/20  Iteration 1355/3560 Training loss: 2.1037 0.3782 sec/batch
Epoch 8/20  Iteration 1356/3560 Training loss: 2.1035 0.3904 sec/batch
Epoch 8/20  Iteration 1357/3560 Training loss: 2.1035 0.3989 sec/batch
Epoch 8/20  Iteration 1358/3560 Training loss: 2.1034 0.3749 sec/batch
Epoch 8/20  Iteration 1359/3560 Training loss: 2.1032 0.3854 sec/batch
Epoch 8/20  Iteration 1360/3560 Training loss: 2.1031 0.3748 sec/batch
Epoch 8/20  Iteration 1361/3560 Training loss: 2.1028 0.3794 sec/batch
Epoch 8/20  Iteration 1362/3560 Training loss: 2.1024 0.3858 sec/batch
Epoch 8/20  Iteration 1363/3560 Training loss: 2.1023 0.3829 sec/batch
Epoch 8/20  Iteration 1364/3560 Training loss: 2.1022 0.3849 sec/batch
Epoch 8/20  Iteration 1365/3560 Training loss: 2.1022 0.3806 sec/batch
Epoch 8/20  Iteration 1366/3560 Training loss: 2.1021 0.3791 sec/batch
Epoch 8/20  Iteration 1367/3560 Training loss: 2.1021 0.3756 sec/batch
Epoch 8/20  Iteration 1368/3560 Training loss: 2.1019 0.3733 sec/batch
Epoch 8/20  Iteration 1369/3560 Training loss: 2.1018 0.3710 sec/batch
Epoch 8/20  Iteration 1370/3560 Training loss: 2.1019 0.4005 sec/batch
Epoch 8/20  Iteration 1371/3560 Training loss: 2.1017 0.3794 sec/batch
Epoch 8/20  Iteration 1372/3560 Training loss: 2.1014 0.4210 sec/batch
Epoch 8/20  Iteration 1373/3560 Training loss: 2.1014 0.3805 sec/batch
Epoch 8/20  Iteration 1374/3560 Training loss: 2.1014 0.3980 sec/batch
Epoch 8/20  Iteration 1375/3560 Training loss: 2.1013 0.3989 sec/batch
Epoch 8/20  Iteration 1376/3560 Training loss: 2.1013 0.3793 sec/batch
Epoch 8/20  Iteration 1377/3560 Training loss: 2.1010 0.3767 sec/batch
Epoch 8/20  Iteration 1378/3560 Training loss: 2.1007 0.3773 sec/batch
Epoch 8/20  Iteration 1379/3560 Training loss: 2.1007 0.3797 sec/batch
Epoch 8/20  Iteration 1380/3560 Training loss: 2.1007 0.3826 sec/batch
Epoch 8/20  Iteration 1381/3560 Training loss: 2.1006 0.3746 sec/batch
Epoch 8/20  Iteration 1382/3560 Training loss: 2.1005 0.3821 sec/batch
Epoch 8/20  Iteration 1383/3560 Training loss: 2.1005 0.4040 sec/batch
Epoch 8/20  Iteration 1384/3560 Training loss: 2.1006 0.3851 sec/batch
Epoch 8/20  Iteration 1385/3560 Training loss: 2.1008 0.3732 sec/batch
Epoch 8/20  Iteration 1386/3560 Training loss: 2.1005 0.3816 sec/batch
Epoch 8/20  Iteration 1387/3560 Training loss: 2.1007 0.3865 sec/batch
Epoch 8/20  Iteration 1388/3560 Training loss: 2.1005 0.3781 sec/batch
Epoch 8/20  Iteration 1389/3560 Training loss: 2.1004 0.3779 sec/batch
Epoch 8/20  Iteration 1390/3560 Training loss: 2.1004 0.3795 sec/batch
Epoch 8/20  Iteration 1391/3560 Training loss: 2.1002 0.3725 sec/batch
Epoch 8/20  Iteration 1392/3560 Training loss: 2.1003 0.3776 sec/batch
Epoch 8/20  Iteration 1393/3560 Training loss: 2.1003 0.3840 sec/batch
Epoch 8/20  Iteration 1394/3560 Training loss: 2.1004 0.3749 sec/batch
Epoch 8/20  Iteration 1395/3560 Training loss: 2.1003 0.3787 sec/batch
Epoch 8/20  Iteration 1396/3560 Training loss: 2.1002 0.3949 sec/batch
Epoch 8/20  Iteration 1397/3560 Training loss: 2.0999 0.3823 sec/batch
Epoch 8/20  Iteration 1398/3560 Training loss: 2.1000 0.3775 sec/batch
Epoch 8/20  Iteration 1399/3560 Training loss: 2.1000 0.3787 sec/batch
Epoch 8/20  Iteration 1400/3560 Training loss: 2.1001 0.3792 sec/batch
Epoch 8/20  Iteration 1401/3560 Training loss: 2.0999 0.3829 sec/batch
Epoch 8/20  Iteration 1402/3560 Training loss: 2.0998 0.3813 sec/batch
Epoch 8/20  Iteration 1403/3560 Training loss: 2.0997 0.3907 sec/batch
Epoch 8/20  Iteration 1404/3560 Training loss: 2.0997 0.3745 sec/batch
Epoch 8/20  Iteration 1405/3560 Training loss: 2.0994 0.3757 sec/batch
Epoch 8/20  Iteration 1406/3560 Training loss: 2.0994 0.3800 sec/batch
Epoch 8/20  Iteration 1407/3560 Training loss: 2.0994 0.3822 sec/batch
Epoch 8/20  Iteration 1408/3560 Training loss: 2.0993 0.3911 sec/batch
Epoch 8/20  Iteration 1409/3560 Training loss: 2.0993 0.3947 sec/batch
Epoch 8/20  Iteration 1410/3560 Training loss: 2.0992 0.3859 sec/batch
Epoch 8/20  Iteration 1411/3560 Training loss: 2.0992 0.3821 sec/batch
Epoch 8/20  Iteration 1412/3560 Training loss: 2.0991 0.3916 sec/batch
Epoch 8/20  Iteration 1413/3560 Training loss: 2.0991 0.3776 sec/batch
Epoch 8/20  Iteration 1414/3560 Training loss: 2.0993 0.3825 sec/batch
Epoch 8/20  Iteration 1415/3560 Training loss: 2.0992 0.3840 sec/batch
Epoch 8/20  Iteration 1416/3560 Training loss: 2.0991 0.3777 sec/batch
Epoch 8/20  Iteration 1417/3560 Training loss: 2.0990 0.3770 sec/batch
Epoch 8/20  Iteration 1418/3560 Training loss: 2.0989 0.3789 sec/batch
Epoch 8/20  Iteration 1419/3560 Training loss: 2.0989 0.3785 sec/batch
Epoch 8/20  Iteration 1420/3560 Training loss: 2.0989 0.3921 sec/batch
Epoch 8/20  Iteration 1421/3560 Training loss: 2.0989 0.3766 sec/batch
Epoch 8/20  Iteration 1422/3560 Training loss: 2.0989 0.3746 sec/batch
Epoch 8/20  Iteration 1423/3560 Training loss: 2.0987 0.3993 sec/batch
Epoch 8/20  Iteration 1424/3560 Training loss: 2.0986 0.3881 sec/batch
Epoch 9/20  Iteration 1425/3560 Training loss: 2.1412 0.3794 sec/batch
Epoch 9/20  Iteration 1426/3560 Training loss: 2.1056 0.3758 sec/batch
Epoch 9/20  Iteration 1427/3560 Training loss: 2.0940 0.3796 sec/batch
Epoch 9/20  Iteration 1428/3560 Training loss: 2.0888 0.3817 sec/batch
Epoch 9/20  Iteration 1429/3560 Training loss: 2.0851 0.3772 sec/batch
Epoch 9/20  Iteration 1430/3560 Training loss: 2.0794 0.3819 sec/batch
Epoch 9/20  Iteration 1431/3560 Training loss: 2.0806 0.3776 sec/batch
Epoch 9/20  Iteration 1432/3560 Training loss: 2.0814 0.3847 sec/batch
Epoch 9/20  Iteration 1433/3560 Training loss: 2.0834 0.3995 sec/batch
Epoch 9/20  Iteration 1434/3560 Training loss: 2.0835 0.4036 sec/batch
Epoch 9/20  Iteration 1435/3560 Training loss: 2.0809 0.3991 sec/batch
Epoch 9/20  Iteration 1436/3560 Training loss: 2.0789 0.4407 sec/batch
Epoch 9/20  Iteration 1437/3560 Training loss: 2.0794 0.4097 sec/batch
Epoch 9/20  Iteration 1438/3560 Training loss: 2.0811 0.4180 sec/batch
Epoch 9/20  Iteration 1439/3560 Training loss: 2.0808 0.4378 sec/batch
Epoch 9/20  Iteration 1440/3560 Training loss: 2.0794 0.4585 sec/batch
Epoch 9/20  Iteration 1441/3560 Training loss: 2.0795 0.4013 sec/batch
Epoch 9/20  Iteration 1442/3560 Training loss: 2.0816 0.4798 sec/batch
Epoch 9/20  Iteration 1443/3560 Training loss: 2.0816 0.4065 sec/batch
Epoch 9/20  Iteration 1444/3560 Training loss: 2.0813 0.3823 sec/batch
Epoch 9/20  Iteration 1445/3560 Training loss: 2.0804 0.3929 sec/batch
Epoch 9/20  Iteration 1446/3560 Training loss: 2.0815 0.3829 sec/batch
Epoch 9/20  Iteration 1447/3560 Training loss: 2.0810 0.3881 sec/batch
Epoch 9/20  Iteration 1448/3560 Training loss: 2.0802 0.4027 sec/batch
Epoch 9/20  Iteration 1449/3560 Training loss: 2.0798 0.3849 sec/batch
Epoch 9/20  Iteration 1450/3560 Training loss: 2.0792 0.3765 sec/batch
Epoch 9/20  Iteration 1451/3560 Training loss: 2.0784 0.3862 sec/batch
Epoch 9/20  Iteration 1452/3560 Training loss: 2.0789 0.3738 sec/batch
Epoch 9/20  Iteration 1453/3560 Training loss: 2.0796 0.3779 sec/batch
Epoch 9/20  Iteration 1454/3560 Training loss: 2.0796 0.3767 sec/batch
Epoch 9/20  Iteration 1455/3560 Training loss: 2.0795 0.3862 sec/batch
Epoch 9/20  Iteration 1456/3560 Training loss: 2.0789 0.3755 sec/batch
Epoch 9/20  Iteration 1457/3560 Training loss: 2.0790 0.3769 sec/batch
Epoch 9/20  Iteration 1458/3560 Training loss: 2.0798 0.3878 sec/batch
Epoch 9/20  Iteration 1459/3560 Training loss: 2.0794 0.3759 sec/batch
Epoch 9/20  Iteration 1460/3560 Training loss: 2.0793 0.3772 sec/batch
Epoch 9/20  Iteration 1461/3560 Training loss: 2.0794 0.3982 sec/batch
Epoch 9/20  Iteration 1462/3560 Training loss: 2.0780 0.3780 sec/batch
Epoch 9/20  Iteration 1463/3560 Training loss: 2.0771 0.3762 sec/batch
Epoch 9/20  Iteration 1464/3560 Training loss: 2.0762 0.3869 sec/batch
Epoch 9/20  Iteration 1465/3560 Training loss: 2.0758 0.3870 sec/batch
Epoch 9/20  Iteration 1466/3560 Training loss: 2.0759 0.3795 sec/batch
Epoch 9/20  Iteration 1467/3560 Training loss: 2.0756 0.3805 sec/batch
Epoch 9/20  Iteration 1468/3560 Training loss: 2.0746 0.3840 sec/batch
Epoch 9/20  Iteration 1469/3560 Training loss: 2.0746 0.3747 sec/batch
Epoch 9/20  Iteration 1470/3560 Training loss: 2.0732 0.3832 sec/batch
Epoch 9/20  Iteration 1471/3560 Training loss: 2.0732 0.3916 sec/batch
Epoch 9/20  Iteration 1472/3560 Training loss: 2.0727 0.3805 sec/batch
Epoch 9/20  Iteration 1473/3560 Training loss: 2.0725 0.3854 sec/batch
Epoch 9/20  Iteration 1474/3560 Training loss: 2.0731 0.3991 sec/batch
Epoch 9/20  Iteration 1475/3560 Training loss: 2.0725 0.4515 sec/batch
Epoch 9/20  Iteration 1476/3560 Training loss: 2.0730 0.5055 sec/batch
Epoch 9/20  Iteration 1477/3560 Training loss: 2.0727 0.3971 sec/batch
Epoch 9/20  Iteration 1478/3560 Training loss: 2.0725 0.3760 sec/batch
Epoch 9/20  Iteration 1479/3560 Training loss: 2.0721 0.3879 sec/batch
Epoch 9/20  Iteration 1480/3560 Training loss: 2.0723 0.3861 sec/batch
Epoch 9/20  Iteration 1481/3560 Training loss: 2.0727 0.3822 sec/batch
Epoch 9/20  Iteration 1482/3560 Training loss: 2.0725 0.3792 sec/batch
Epoch 9/20  Iteration 1483/3560 Training loss: 2.0721 0.3806 sec/batch
Epoch 9/20  Iteration 1484/3560 Training loss: 2.0724 0.3787 sec/batch
Epoch 9/20  Iteration 1485/3560 Training loss: 2.0722 0.3837 sec/batch
Epoch 9/20  Iteration 1486/3560 Training loss: 2.0728 0.3810 sec/batch
Epoch 9/20  Iteration 1487/3560 Training loss: 2.0731 0.4124 sec/batch
Epoch 9/20  Iteration 1488/3560 Training loss: 2.0731 0.3773 sec/batch
Epoch 9/20  Iteration 1489/3560 Training loss: 2.0728 0.3782 sec/batch
Epoch 9/20  Iteration 1490/3560 Training loss: 2.0730 0.3839 sec/batch
Epoch 9/20  Iteration 1491/3560 Training loss: 2.0731 0.3820 sec/batch
Epoch 9/20  Iteration 1492/3560 Training loss: 2.0726 0.3764 sec/batch
Epoch 9/20  Iteration 1493/3560 Training loss: 2.0722 0.3853 sec/batch
Epoch 9/20  Iteration 1494/3560 Training loss: 2.0722 0.3941 sec/batch
Epoch 9/20  Iteration 1495/3560 Training loss: 2.0724 0.3803 sec/batch
Epoch 9/20  Iteration 1496/3560 Training loss: 2.0724 0.3801 sec/batch
Epoch 9/20  Iteration 1497/3560 Training loss: 2.0728 0.3801 sec/batch
Epoch 9/20  Iteration 1498/3560 Training loss: 2.0724 0.3823 sec/batch
Epoch 9/20  Iteration 1499/3560 Training loss: 2.0723 0.3778 sec/batch
Epoch 9/20  Iteration 1500/3560 Training loss: 2.0726 0.3952 sec/batch
Epoch 9/20  Iteration 1501/3560 Training loss: 2.0725 0.3767 sec/batch
Epoch 9/20  Iteration 1502/3560 Training loss: 2.0725 0.3797 sec/batch
Epoch 9/20  Iteration 1503/3560 Training loss: 2.0721 0.3756 sec/batch
Epoch 9/20  Iteration 1504/3560 Training loss: 2.0721 0.3791 sec/batch
Epoch 9/20  Iteration 1505/3560 Training loss: 2.0717 0.3826 sec/batch
Epoch 9/20  Iteration 1506/3560 Training loss: 2.0717 0.3818 sec/batch
Epoch 9/20  Iteration 1507/3560 Training loss: 2.0711 0.3777 sec/batch
Epoch 9/20  Iteration 1508/3560 Training loss: 2.0710 0.3844 sec/batch
Epoch 9/20  Iteration 1509/3560 Training loss: 2.0704 0.3770 sec/batch
Epoch 9/20  Iteration 1510/3560 Training loss: 2.0700 0.3771 sec/batch
Epoch 9/20  Iteration 1511/3560 Training loss: 2.0698 0.3897 sec/batch
Epoch 9/20  Iteration 1512/3560 Training loss: 2.0695 0.3846 sec/batch
Epoch 9/20  Iteration 1513/3560 Training loss: 2.0692 0.4016 sec/batch
Epoch 9/20  Iteration 1514/3560 Training loss: 2.0692 0.3814 sec/batch
Epoch 9/20  Iteration 1515/3560 Training loss: 2.0687 0.3868 sec/batch
Epoch 9/20  Iteration 1516/3560 Training loss: 2.0686 0.3793 sec/batch
Epoch 9/20  Iteration 1517/3560 Training loss: 2.0682 0.3880 sec/batch
Epoch 9/20  Iteration 1518/3560 Training loss: 2.0678 0.3773 sec/batch
Epoch 9/20  Iteration 1519/3560 Training loss: 2.0675 0.3760 sec/batch
Epoch 9/20  Iteration 1520/3560 Training loss: 2.0673 0.3799 sec/batch
Epoch 9/20  Iteration 1521/3560 Training loss: 2.0671 0.3872 sec/batch
Epoch 9/20  Iteration 1522/3560 Training loss: 2.0668 0.3769 sec/batch
Epoch 9/20  Iteration 1523/3560 Training loss: 2.0665 0.3774 sec/batch
Epoch 9/20  Iteration 1524/3560 Training loss: 2.0661 0.3795 sec/batch
Epoch 9/20  Iteration 1525/3560 Training loss: 2.0661 0.3849 sec/batch
Epoch 9/20  Iteration 1526/3560 Training loss: 2.0659 0.4072 sec/batch
Epoch 9/20  Iteration 1527/3560 Training loss: 2.0656 0.3793 sec/batch
Epoch 9/20  Iteration 1528/3560 Training loss: 2.0654 0.3847 sec/batch
Epoch 9/20  Iteration 1529/3560 Training loss: 2.0652 0.3769 sec/batch
Epoch 9/20  Iteration 1530/3560 Training loss: 2.0650 0.3780 sec/batch
Epoch 9/20  Iteration 1531/3560 Training loss: 2.0650 0.3821 sec/batch
Epoch 9/20  Iteration 1532/3560 Training loss: 2.0649 0.3732 sec/batch
Epoch 9/20  Iteration 1533/3560 Training loss: 2.0649 0.3758 sec/batch
Epoch 9/20  Iteration 1534/3560 Training loss: 2.0647 0.3805 sec/batch
Epoch 9/20  Iteration 1535/3560 Training loss: 2.0647 0.3774 sec/batch
Epoch 9/20  Iteration 1536/3560 Training loss: 2.0646 0.3777 sec/batch
Epoch 9/20  Iteration 1537/3560 Training loss: 2.0645 0.3790 sec/batch
Epoch 9/20  Iteration 1538/3560 Training loss: 2.0643 0.3779 sec/batch
Epoch 9/20  Iteration 1539/3560 Training loss: 2.0641 0.4021 sec/batch
Epoch 9/20  Iteration 1540/3560 Training loss: 2.0636 0.3793 sec/batch
Epoch 9/20  Iteration 1541/3560 Training loss: 2.0635 0.3784 sec/batch
Epoch 9/20  Iteration 1542/3560 Training loss: 2.0633 0.3830 sec/batch
Epoch 9/20  Iteration 1543/3560 Training loss: 2.0634 0.3761 sec/batch
Epoch 9/20  Iteration 1544/3560 Training loss: 2.0632 0.3751 sec/batch
Epoch 9/20  Iteration 1545/3560 Training loss: 2.0632 0.3787 sec/batch
Epoch 9/20  Iteration 1546/3560 Training loss: 2.0630 0.3899 sec/batch
Epoch 9/20  Iteration 1547/3560 Training loss: 2.0628 0.3718 sec/batch
Epoch 9/20  Iteration 1548/3560 Training loss: 2.0629 0.3787 sec/batch
Epoch 9/20  Iteration 1549/3560 Training loss: 2.0629 0.3822 sec/batch
Epoch 9/20  Iteration 1550/3560 Training loss: 2.0625 0.3815 sec/batch
Epoch 9/20  Iteration 1551/3560 Training loss: 2.0625 0.3834 sec/batch
Epoch 9/20  Iteration 1552/3560 Training loss: 2.0625 0.3827 sec/batch
Epoch 9/20  Iteration 1553/3560 Training loss: 2.0624 0.3939 sec/batch
Epoch 9/20  Iteration 1554/3560 Training loss: 2.0624 0.3767 sec/batch
Epoch 9/20  Iteration 1555/3560 Training loss: 2.0621 0.3739 sec/batch
Epoch 9/20  Iteration 1556/3560 Training loss: 2.0618 0.3884 sec/batch
Epoch 9/20  Iteration 1557/3560 Training loss: 2.0618 0.3910 sec/batch
Epoch 9/20  Iteration 1558/3560 Training loss: 2.0619 0.3870 sec/batch
Epoch 9/20  Iteration 1559/3560 Training loss: 2.0618 0.3929 sec/batch
Epoch 9/20  Iteration 1560/3560 Training loss: 2.0617 0.3920 sec/batch
Epoch 9/20  Iteration 1561/3560 Training loss: 2.0617 0.3952 sec/batch
Epoch 9/20  Iteration 1562/3560 Training loss: 2.0617 0.3742 sec/batch
Epoch 9/20  Iteration 1563/3560 Training loss: 2.0619 0.3822 sec/batch
Epoch 9/20  Iteration 1564/3560 Training loss: 2.0617 0.3763 sec/batch
Epoch 9/20  Iteration 1565/3560 Training loss: 2.0619 0.3921 sec/batch
Epoch 9/20  Iteration 1566/3560 Training loss: 2.0618 0.3987 sec/batch
Epoch 9/20  Iteration 1567/3560 Training loss: 2.0617 0.3841 sec/batch
Epoch 9/20  Iteration 1568/3560 Training loss: 2.0617 0.3738 sec/batch
Epoch 9/20  Iteration 1569/3560 Training loss: 2.0615 0.3815 sec/batch
Epoch 9/20  Iteration 1570/3560 Training loss: 2.0616 0.3694 sec/batch
Epoch 9/20  Iteration 1571/3560 Training loss: 2.0616 0.3772 sec/batch
Epoch 9/20  Iteration 1572/3560 Training loss: 2.0617 0.3686 sec/batch
Epoch 9/20  Iteration 1573/3560 Training loss: 2.0617 0.3838 sec/batch
Epoch 9/20  Iteration 1574/3560 Training loss: 2.0616 0.3739 sec/batch
Epoch 9/20  Iteration 1575/3560 Training loss: 2.0614 0.3768 sec/batch
Epoch 9/20  Iteration 1576/3560 Training loss: 2.0616 0.3812 sec/batch
Epoch 9/20  Iteration 1577/3560 Training loss: 2.0616 0.3783 sec/batch
Epoch 9/20  Iteration 1578/3560 Training loss: 2.0616 0.3791 sec/batch
Epoch 9/20  Iteration 1579/3560 Training loss: 2.0615 0.4075 sec/batch
Epoch 9/20  Iteration 1580/3560 Training loss: 2.0614 0.3732 sec/batch
Epoch 9/20  Iteration 1581/3560 Training loss: 2.0614 0.3802 sec/batch
Epoch 9/20  Iteration 1582/3560 Training loss: 2.0613 0.3753 sec/batch
Epoch 9/20  Iteration 1583/3560 Training loss: 2.0611 0.3791 sec/batch
Epoch 9/20  Iteration 1584/3560 Training loss: 2.0612 0.3768 sec/batch
Epoch 9/20  Iteration 1585/3560 Training loss: 2.0612 0.3789 sec/batch
Epoch 9/20  Iteration 1586/3560 Training loss: 2.0611 0.3773 sec/batch
Epoch 9/20  Iteration 1587/3560 Training loss: 2.0611 0.3830 sec/batch
Epoch 9/20  Iteration 1588/3560 Training loss: 2.0611 0.3823 sec/batch
Epoch 9/20  Iteration 1589/3560 Training loss: 2.0610 0.3782 sec/batch
Epoch 9/20  Iteration 1590/3560 Training loss: 2.0608 0.3806 sec/batch
Epoch 9/20  Iteration 1591/3560 Training loss: 2.0608 0.3778 sec/batch
Epoch 9/20  Iteration 1592/3560 Training loss: 2.0610 0.4153 sec/batch
Epoch 9/20  Iteration 1593/3560 Training loss: 2.0609 0.3831 sec/batch
Epoch 9/20  Iteration 1594/3560 Training loss: 2.0608 0.3793 sec/batch
Epoch 9/20  Iteration 1595/3560 Training loss: 2.0607 0.3799 sec/batch
Epoch 9/20  Iteration 1596/3560 Training loss: 2.0605 0.3729 sec/batch
Epoch 9/20  Iteration 1597/3560 Training loss: 2.0606 0.3667 sec/batch
Epoch 9/20  Iteration 1598/3560 Training loss: 2.0605 0.3886 sec/batch
Epoch 9/20  Iteration 1599/3560 Training loss: 2.0606 0.3783 sec/batch
Epoch 9/20  Iteration 1600/3560 Training loss: 2.0605 0.3838 sec/batch
Epoch 9/20  Iteration 1601/3560 Training loss: 2.0604 0.3907 sec/batch
Epoch 9/20  Iteration 1602/3560 Training loss: 2.0604 0.3759 sec/batch
Epoch 10/20  Iteration 1603/3560 Training loss: 2.1149 0.3764 sec/batch
Epoch 10/20  Iteration 1604/3560 Training loss: 2.0684 0.3832 sec/batch
Epoch 10/20  Iteration 1605/3560 Training loss: 2.0572 0.4004 sec/batch
Epoch 10/20  Iteration 1606/3560 Training loss: 2.0530 0.3800 sec/batch
Epoch 10/20  Iteration 1607/3560 Training loss: 2.0507 0.3788 sec/batch
Epoch 10/20  Iteration 1608/3560 Training loss: 2.0439 0.3779 sec/batch
Epoch 10/20  Iteration 1609/3560 Training loss: 2.0448 0.3762 sec/batch
Epoch 10/20  Iteration 1610/3560 Training loss: 2.0459 0.3850 sec/batch
Epoch 10/20  Iteration 1611/3560 Training loss: 2.0480 0.3810 sec/batch
Epoch 10/20  Iteration 1612/3560 Training loss: 2.0472 0.3814 sec/batch
Epoch 10/20  Iteration 1613/3560 Training loss: 2.0446 0.3834 sec/batch
Epoch 10/20  Iteration 1614/3560 Training loss: 2.0425 0.3763 sec/batch
Epoch 10/20  Iteration 1615/3560 Training loss: 2.0430 0.3772 sec/batch
Epoch 10/20  Iteration 1616/3560 Training loss: 2.0453 0.3834 sec/batch
Epoch 10/20  Iteration 1617/3560 Training loss: 2.0442 0.3745 sec/batch
Epoch 10/20  Iteration 1618/3560 Training loss: 2.0432 0.3754 sec/batch
Epoch 10/20  Iteration 1619/3560 Training loss: 2.0430 0.4019 sec/batch
Epoch 10/20  Iteration 1620/3560 Training loss: 2.0452 0.3730 sec/batch
Epoch 10/20  Iteration 1621/3560 Training loss: 2.0450 0.3778 sec/batch
Epoch 10/20  Iteration 1622/3560 Training loss: 2.0450 0.3746 sec/batch
Epoch 10/20  Iteration 1623/3560 Training loss: 2.0442 0.3760 sec/batch
Epoch 10/20  Iteration 1624/3560 Training loss: 2.0452 0.3735 sec/batch
Epoch 10/20  Iteration 1625/3560 Training loss: 2.0447 0.3762 sec/batch
Epoch 10/20  Iteration 1626/3560 Training loss: 2.0438 0.3768 sec/batch
Epoch 10/20  Iteration 1627/3560 Training loss: 2.0437 0.3731 sec/batch
Epoch 10/20  Iteration 1628/3560 Training loss: 2.0429 0.3836 sec/batch
Epoch 10/20  Iteration 1629/3560 Training loss: 2.0420 0.3820 sec/batch
Epoch 10/20  Iteration 1630/3560 Training loss: 2.0423 0.3896 sec/batch
Epoch 10/20  Iteration 1631/3560 Training loss: 2.0429 0.3807 sec/batch
Epoch 10/20  Iteration 1632/3560 Training loss: 2.0430 0.4059 sec/batch
Epoch 10/20  Iteration 1633/3560 Training loss: 2.0431 0.3744 sec/batch
Epoch 10/20  Iteration 1634/3560 Training loss: 2.0428 0.3769 sec/batch
Epoch 10/20  Iteration 1635/3560 Training loss: 2.0429 0.3723 sec/batch
Epoch 10/20  Iteration 1636/3560 Training loss: 2.0438 0.3762 sec/batch
Epoch 10/20  Iteration 1637/3560 Training loss: 2.0435 0.3754 sec/batch
Epoch 10/20  Iteration 1638/3560 Training loss: 2.0431 0.3848 sec/batch
Epoch 10/20  Iteration 1639/3560 Training loss: 2.0428 0.3743 sec/batch
Epoch 10/20  Iteration 1640/3560 Training loss: 2.0417 0.3815 sec/batch
Epoch 10/20  Iteration 1641/3560 Training loss: 2.0407 0.3803 sec/batch
Epoch 10/20  Iteration 1642/3560 Training loss: 2.0399 0.3785 sec/batch
Epoch 10/20  Iteration 1643/3560 Training loss: 2.0396 0.3951 sec/batch
Epoch 10/20  Iteration 1644/3560 Training loss: 2.0396 0.3754 sec/batch
Epoch 10/20  Iteration 1645/3560 Training loss: 2.0392 0.4026 sec/batch
Epoch 10/20  Iteration 1646/3560 Training loss: 2.0385 0.3817 sec/batch
Epoch 10/20  Iteration 1647/3560 Training loss: 2.0385 0.3905 sec/batch
Epoch 10/20  Iteration 1648/3560 Training loss: 2.0374 0.3767 sec/batch
Epoch 10/20  Iteration 1649/3560 Training loss: 2.0375 0.3782 sec/batch
Epoch 10/20  Iteration 1650/3560 Training loss: 2.0368 0.3760 sec/batch
Epoch 10/20  Iteration 1651/3560 Training loss: 2.0366 0.3881 sec/batch
Epoch 10/20  Iteration 1652/3560 Training loss: 2.0374 0.3827 sec/batch
Epoch 10/20  Iteration 1653/3560 Training loss: 2.0366 0.3771 sec/batch
Epoch 10/20  Iteration 1654/3560 Training loss: 2.0372 0.3806 sec/batch
Epoch 10/20  Iteration 1655/3560 Training loss: 2.0370 0.3745 sec/batch
Epoch 10/20  Iteration 1656/3560 Training loss: 2.0370 0.3751 sec/batch
Epoch 10/20  Iteration 1657/3560 Training loss: 2.0366 0.3834 sec/batch
Epoch 10/20  Iteration 1658/3560 Training loss: 2.0367 0.4081 sec/batch
Epoch 10/20  Iteration 1659/3560 Training loss: 2.0369 0.3762 sec/batch
Epoch 10/20  Iteration 1660/3560 Training loss: 2.0367 0.3844 sec/batch
Epoch 10/20  Iteration 1661/3560 Training loss: 2.0363 0.3777 sec/batch
Epoch 10/20  Iteration 1662/3560 Training loss: 2.0368 0.3825 sec/batch
Epoch 10/20  Iteration 1663/3560 Training loss: 2.0369 0.3793 sec/batch
Epoch 10/20  Iteration 1664/3560 Training loss: 2.0375 0.3717 sec/batch
Epoch 10/20  Iteration 1665/3560 Training loss: 2.0376 0.3992 sec/batch
Epoch 10/20  Iteration 1666/3560 Training loss: 2.0378 0.3738 sec/batch
Epoch 10/20  Iteration 1667/3560 Training loss: 2.0377 0.3855 sec/batch
Epoch 10/20  Iteration 1668/3560 Training loss: 2.0380 0.3714 sec/batch
Epoch 10/20  Iteration 1669/3560 Training loss: 2.0382 0.3794 sec/batch
Epoch 10/20  Iteration 1670/3560 Training loss: 2.0377 0.3835 sec/batch
Epoch 10/20  Iteration 1671/3560 Training loss: 2.0374 0.3964 sec/batch
Epoch 10/20  Iteration 1672/3560 Training loss: 2.0373 0.3858 sec/batch
Epoch 10/20  Iteration 1673/3560 Training loss: 2.0377 0.3813 sec/batch
Epoch 10/20  Iteration 1674/3560 Training loss: 2.0378 0.3765 sec/batch
Epoch 10/20  Iteration 1675/3560 Training loss: 2.0382 0.3905 sec/batch
Epoch 10/20  Iteration 1676/3560 Training loss: 2.0379 0.3851 sec/batch
Epoch 10/20  Iteration 1677/3560 Training loss: 2.0378 0.3778 sec/batch
Epoch 10/20  Iteration 1678/3560 Training loss: 2.0381 0.3772 sec/batch
Epoch 10/20  Iteration 1679/3560 Training loss: 2.0381 0.3715 sec/batch
Epoch 10/20  Iteration 1680/3560 Training loss: 2.0380 0.4161 sec/batch
Epoch 10/20  Iteration 1681/3560 Training loss: 2.0376 0.3838 sec/batch
Epoch 10/20  Iteration 1682/3560 Training loss: 2.0374 0.3818 sec/batch
Epoch 10/20  Iteration 1683/3560 Training loss: 2.0369 0.3808 sec/batch
Epoch 10/20  Iteration 1684/3560 Training loss: 2.0369 0.4006 sec/batch
Epoch 10/20  Iteration 1685/3560 Training loss: 2.0364 0.3962 sec/batch
Epoch 10/20  Iteration 1686/3560 Training loss: 2.0362 0.3730 sec/batch
Epoch 10/20  Iteration 1687/3560 Training loss: 2.0355 0.3802 sec/batch
Epoch 10/20  Iteration 1688/3560 Training loss: 2.0353 0.3760 sec/batch
Epoch 10/20  Iteration 1689/3560 Training loss: 2.0352 0.3790 sec/batch
Epoch 10/20  Iteration 1690/3560 Training loss: 2.0350 0.3824 sec/batch
Epoch 10/20  Iteration 1691/3560 Training loss: 2.0345 0.3818 sec/batch
Epoch 10/20  Iteration 1692/3560 Training loss: 2.0345 0.3888 sec/batch
Epoch 10/20  Iteration 1693/3560 Training loss: 2.0341 0.3766 sec/batch
Epoch 10/20  Iteration 1694/3560 Training loss: 2.0340 0.3754 sec/batch
Epoch 10/20  Iteration 1695/3560 Training loss: 2.0335 0.3745 sec/batch
Epoch 10/20  Iteration 1696/3560 Training loss: 2.0332 0.3757 sec/batch
Epoch 10/20  Iteration 1697/3560 Training loss: 2.0329 0.3807 sec/batch
Epoch 10/20  Iteration 1698/3560 Training loss: 2.0328 0.4001 sec/batch
Epoch 10/20  Iteration 1699/3560 Training loss: 2.0326 0.3775 sec/batch
Epoch 10/20  Iteration 1700/3560 Training loss: 2.0322 0.3766 sec/batch
Epoch 10/20  Iteration 1701/3560 Training loss: 2.0319 0.3760 sec/batch
Epoch 10/20  Iteration 1702/3560 Training loss: 2.0315 0.3829 sec/batch
Epoch 10/20  Iteration 1703/3560 Training loss: 2.0314 0.3766 sec/batch
Epoch 10/20  Iteration 1704/3560 Training loss: 2.0313 0.3767 sec/batch
Epoch 10/20  Iteration 1705/3560 Training loss: 2.0310 0.3779 sec/batch
Epoch 10/20  Iteration 1706/3560 Training loss: 2.0308 0.3818 sec/batch
Epoch 10/20  Iteration 1707/3560 Training loss: 2.0304 0.3878 sec/batch
Epoch 10/20  Iteration 1708/3560 Training loss: 2.0303 0.3799 sec/batch
Epoch 10/20  Iteration 1709/3560 Training loss: 2.0301 0.3780 sec/batch
Epoch 10/20  Iteration 1710/3560 Training loss: 2.0301 0.3817 sec/batch
Epoch 10/20  Iteration 1711/3560 Training loss: 2.0301 0.4008 sec/batch
Epoch 10/20  Iteration 1712/3560 Training loss: 2.0299 0.3749 sec/batch
Epoch 10/20  Iteration 1713/3560 Training loss: 2.0299 0.3817 sec/batch
Epoch 10/20  Iteration 1714/3560 Training loss: 2.0297 0.3813 sec/batch
Epoch 10/20  Iteration 1715/3560 Training loss: 2.0297 0.3787 sec/batch
Epoch 10/20  Iteration 1716/3560 Training loss: 2.0295 0.3901 sec/batch
Epoch 10/20  Iteration 1717/3560 Training loss: 2.0292 0.3823 sec/batch
Epoch 10/20  Iteration 1718/3560 Training loss: 2.0288 0.3805 sec/batch
Epoch 10/20  Iteration 1719/3560 Training loss: 2.0287 0.3829 sec/batch
Epoch 10/20  Iteration 1720/3560 Training loss: 2.0286 0.3919 sec/batch
Epoch 10/20  Iteration 1721/3560 Training loss: 2.0285 0.3849 sec/batch
Epoch 10/20  Iteration 1722/3560 Training loss: 2.0284 0.3872 sec/batch
Epoch 10/20  Iteration 1723/3560 Training loss: 2.0284 0.3827 sec/batch
Epoch 10/20  Iteration 1724/3560 Training loss: 2.0281 0.3987 sec/batch
Epoch 10/20  Iteration 1725/3560 Training loss: 2.0279 0.3765 sec/batch
Epoch 10/20  Iteration 1726/3560 Training loss: 2.0280 0.3754 sec/batch
Epoch 10/20  Iteration 1727/3560 Training loss: 2.0279 0.3798 sec/batch
Epoch 10/20  Iteration 1728/3560 Training loss: 2.0275 0.3758 sec/batch
Epoch 10/20  Iteration 1729/3560 Training loss: 2.0276 0.3770 sec/batch
Epoch 10/20  Iteration 1730/3560 Training loss: 2.0276 0.3764 sec/batch
Epoch 10/20  Iteration 1731/3560 Training loss: 2.0276 0.3801 sec/batch
Epoch 10/20  Iteration 1732/3560 Training loss: 2.0275 0.3800 sec/batch
Epoch 10/20  Iteration 1733/3560 Training loss: 2.0273 0.3809 sec/batch
Epoch 10/20  Iteration 1734/3560 Training loss: 2.0270 0.3871 sec/batch
Epoch 10/20  Iteration 1735/3560 Training loss: 2.0269 0.3766 sec/batch
Epoch 10/20  Iteration 1736/3560 Training loss: 2.0270 0.3804 sec/batch
Epoch 10/20  Iteration 1737/3560 Training loss: 2.0269 0.4053 sec/batch
Epoch 10/20  Iteration 1738/3560 Training loss: 2.0270 0.3855 sec/batch
Epoch 10/20  Iteration 1739/3560 Training loss: 2.0270 0.3819 sec/batch
Epoch 10/20  Iteration 1740/3560 Training loss: 2.0270 0.3746 sec/batch
Epoch 10/20  Iteration 1741/3560 Training loss: 2.0272 0.3794 sec/batch
Epoch 10/20  Iteration 1742/3560 Training loss: 2.0269 0.3787 sec/batch
Epoch 10/20  Iteration 1743/3560 Training loss: 2.0271 0.3778 sec/batch
Epoch 10/20  Iteration 1744/3560 Training loss: 2.0269 0.3791 sec/batch
Epoch 10/20  Iteration 1745/3560 Training loss: 2.0269 0.3765 sec/batch
Epoch 10/20  Iteration 1746/3560 Training loss: 2.0268 0.3767 sec/batch
Epoch 10/20  Iteration 1747/3560 Training loss: 2.0267 0.3757 sec/batch
Epoch 10/20  Iteration 1748/3560 Training loss: 2.0268 0.3769 sec/batch
Epoch 10/20  Iteration 1749/3560 Training loss: 2.0268 0.3783 sec/batch
Epoch 10/20  Iteration 1750/3560 Training loss: 2.0269 0.3792 sec/batch
Epoch 10/20  Iteration 1751/3560 Training loss: 2.0268 0.4018 sec/batch
Epoch 10/20  Iteration 1752/3560 Training loss: 2.0266 0.3777 sec/batch
Epoch 10/20  Iteration 1753/3560 Training loss: 2.0264 0.3894 sec/batch
Epoch 10/20  Iteration 1754/3560 Training loss: 2.0265 0.3885 sec/batch
Epoch 10/20  Iteration 1755/3560 Training loss: 2.0265 0.3845 sec/batch
Epoch 10/20  Iteration 1756/3560 Training loss: 2.0265 0.3828 sec/batch
Epoch 10/20  Iteration 1757/3560 Training loss: 2.0264 0.3786 sec/batch
Epoch 10/20  Iteration 1758/3560 Training loss: 2.0263 0.3769 sec/batch
Epoch 10/20  Iteration 1759/3560 Training loss: 2.0262 0.3783 sec/batch
Epoch 10/20  Iteration 1760/3560 Training loss: 2.0262 0.3749 sec/batch
Epoch 10/20  Iteration 1761/3560 Training loss: 2.0260 0.3763 sec/batch
Epoch 10/20  Iteration 1762/3560 Training loss: 2.0261 0.3775 sec/batch
Epoch 10/20  Iteration 1763/3560 Training loss: 2.0261 0.3850 sec/batch
Epoch 10/20  Iteration 1764/3560 Training loss: 2.0260 0.4005 sec/batch
Epoch 10/20  Iteration 1765/3560 Training loss: 2.0261 0.3755 sec/batch
Epoch 10/20  Iteration 1766/3560 Training loss: 2.0260 0.3804 sec/batch
Epoch 10/20  Iteration 1767/3560 Training loss: 2.0260 0.3803 sec/batch
Epoch 10/20  Iteration 1768/3560 Training loss: 2.0259 0.3928 sec/batch
Epoch 10/20  Iteration 1769/3560 Training loss: 2.0259 0.3825 sec/batch
Epoch 10/20  Iteration 1770/3560 Training loss: 2.0261 0.3824 sec/batch
Epoch 10/20  Iteration 1771/3560 Training loss: 2.0260 0.3802 sec/batch
Epoch 10/20  Iteration 1772/3560 Training loss: 2.0259 0.3740 sec/batch
Epoch 10/20  Iteration 1773/3560 Training loss: 2.0258 0.3772 sec/batch
Epoch 10/20  Iteration 1774/3560 Training loss: 2.0257 0.3828 sec/batch
Epoch 10/20  Iteration 1775/3560 Training loss: 2.0258 0.3824 sec/batch
Epoch 10/20  Iteration 1776/3560 Training loss: 2.0258 0.3821 sec/batch
Epoch 10/20  Iteration 1777/3560 Training loss: 2.0258 0.3952 sec/batch
Epoch 10/20  Iteration 1778/3560 Training loss: 2.0258 0.3698 sec/batch
Epoch 10/20  Iteration 1779/3560 Training loss: 2.0257 0.3821 sec/batch
Epoch 10/20  Iteration 1780/3560 Training loss: 2.0256 0.3790 sec/batch
Epoch 11/20  Iteration 1781/3560 Training loss: 2.0880 0.3870 sec/batch
Epoch 11/20  Iteration 1782/3560 Training loss: 2.0424 0.3822 sec/batch
Epoch 11/20  Iteration 1783/3560 Training loss: 2.0263 0.3796 sec/batch
Epoch 11/20  Iteration 1784/3560 Training loss: 2.0200 0.3772 sec/batch
Epoch 11/20  Iteration 1785/3560 Training loss: 2.0168 0.3774 sec/batch
Epoch 11/20  Iteration 1786/3560 Training loss: 2.0100 0.3788 sec/batch
Epoch 11/20  Iteration 1787/3560 Training loss: 2.0104 0.3800 sec/batch
Epoch 11/20  Iteration 1788/3560 Training loss: 2.0108 0.3779 sec/batch
Epoch 11/20  Iteration 1789/3560 Training loss: 2.0139 0.3782 sec/batch
Epoch 11/20  Iteration 1790/3560 Training loss: 2.0130 0.4003 sec/batch
Epoch 11/20  Iteration 1791/3560 Training loss: 2.0099 0.3795 sec/batch
Epoch 11/20  Iteration 1792/3560 Training loss: 2.0090 0.3781 sec/batch
Epoch 11/20  Iteration 1793/3560 Training loss: 2.0086 0.3769 sec/batch
Epoch 11/20  Iteration 1794/3560 Training loss: 2.0107 0.3819 sec/batch
Epoch 11/20  Iteration 1795/3560 Training loss: 2.0098 0.3768 sec/batch
Epoch 11/20  Iteration 1796/3560 Training loss: 2.0090 0.3775 sec/batch
Epoch 11/20  Iteration 1797/3560 Training loss: 2.0086 0.3891 sec/batch
Epoch 11/20  Iteration 1798/3560 Training loss: 2.0104 0.3940 sec/batch
Epoch 11/20  Iteration 1799/3560 Training loss: 2.0104 0.3730 sec/batch
Epoch 11/20  Iteration 1800/3560 Training loss: 2.0106 0.3782 sec/batch
Epoch 11/20  Iteration 1801/3560 Training loss: 2.0101 0.3858 sec/batch
Epoch 11/20  Iteration 1802/3560 Training loss: 2.0110 0.3841 sec/batch
Epoch 11/20  Iteration 1803/3560 Training loss: 2.0108 0.3881 sec/batch
Epoch 11/20  Iteration 1804/3560 Training loss: 2.0099 0.3903 sec/batch
Epoch 11/20  Iteration 1805/3560 Training loss: 2.0096 0.3856 sec/batch
Epoch 11/20  Iteration 1806/3560 Training loss: 2.0085 0.3790 sec/batch
Epoch 11/20  Iteration 1807/3560 Training loss: 2.0076 0.3728 sec/batch
Epoch 11/20  Iteration 1808/3560 Training loss: 2.0081 0.3824 sec/batch
Epoch 11/20  Iteration 1809/3560 Training loss: 2.0089 0.3779 sec/batch
Epoch 11/20  Iteration 1810/3560 Training loss: 2.0093 0.3774 sec/batch
Epoch 11/20  Iteration 1811/3560 Training loss: 2.0094 0.3749 sec/batch
Epoch 11/20  Iteration 1812/3560 Training loss: 2.0090 0.3887 sec/batch
Epoch 11/20  Iteration 1813/3560 Training loss: 2.0089 0.3790 sec/batch
Epoch 11/20  Iteration 1814/3560 Training loss: 2.0100 0.3751 sec/batch
Epoch 11/20  Iteration 1815/3560 Training loss: 2.0095 0.3792 sec/batch
Epoch 11/20  Iteration 1816/3560 Training loss: 2.0095 0.3776 sec/batch
Epoch 11/20  Iteration 1817/3560 Training loss: 2.0092 0.3996 sec/batch
Epoch 11/20  Iteration 1818/3560 Training loss: 2.0082 0.3780 sec/batch
Epoch 11/20  Iteration 1819/3560 Training loss: 2.0073 0.3795 sec/batch
Epoch 11/20  Iteration 1820/3560 Training loss: 2.0068 0.3791 sec/batch
Epoch 11/20  Iteration 1821/3560 Training loss: 2.0062 0.3777 sec/batch
Epoch 11/20  Iteration 1822/3560 Training loss: 2.0062 0.3782 sec/batch
Epoch 11/20  Iteration 1823/3560 Training loss: 2.0057 0.3853 sec/batch
Epoch 11/20  Iteration 1824/3560 Training loss: 2.0053 0.3709 sec/batch
Epoch 11/20  Iteration 1825/3560 Training loss: 2.0055 0.3768 sec/batch
Epoch 11/20  Iteration 1826/3560 Training loss: 2.0042 0.3786 sec/batch
Epoch 11/20  Iteration 1827/3560 Training loss: 2.0040 0.3804 sec/batch
Epoch 11/20  Iteration 1828/3560 Training loss: 2.0034 0.3776 sec/batch
Epoch 11/20  Iteration 1829/3560 Training loss: 2.0031 0.3863 sec/batch
Epoch 11/20  Iteration 1830/3560 Training loss: 2.0039 0.3976 sec/batch
Epoch 11/20  Iteration 1831/3560 Training loss: 2.0032 0.3753 sec/batch
Epoch 11/20  Iteration 1832/3560 Training loss: 2.0041 0.3814 sec/batch
Epoch 11/20  Iteration 1833/3560 Training loss: 2.0039 0.3783 sec/batch
Epoch 11/20  Iteration 1834/3560 Training loss: 2.0038 0.3838 sec/batch
Epoch 11/20  Iteration 1835/3560 Training loss: 2.0035 0.3845 sec/batch
Epoch 11/20  Iteration 1836/3560 Training loss: 2.0037 0.3710 sec/batch
Epoch 11/20  Iteration 1837/3560 Training loss: 2.0039 0.3781 sec/batch
Epoch 11/20  Iteration 1838/3560 Training loss: 2.0036 0.3851 sec/batch
Epoch 11/20  Iteration 1839/3560 Training loss: 2.0031 0.3750 sec/batch
Epoch 11/20  Iteration 1840/3560 Training loss: 2.0036 0.3821 sec/batch
Epoch 11/20  Iteration 1841/3560 Training loss: 2.0036 0.3797 sec/batch
...
Epoch 11/20  Iteration 1958/3560 Training loss: 1.9945 0.3807 sec/batch
Epoch 12/20  Iteration 1959/3560 Training loss: 2.0561 0.3800 sec/batch
...
Epoch 12/20  Iteration 2136/3560 Training loss: 1.9657 0.3751 sec/batch
Epoch 13/20  Iteration 2137/3560 Training loss: 2.0242 0.3721 sec/batch
...
Epoch 13/20  Iteration 2314/3560 Training loss: 1.9396 0.3895 sec/batch
Epoch 14/20  Iteration 2315/3560 Training loss: 1.9971 0.3949 sec/batch
...
Epoch 14/20  Iteration 2492/3560 Training loss: 1.9168 0.3792 sec/batch
Epoch 15/20  Iteration 2493/3560 Training loss: 1.9776 0.3844 sec/batch
...
Epoch 15/20  Iteration 2670/3560 Training loss: 1.8943 0.4854 sec/batch
Epoch 16/20  Iteration 2671/3560 Training loss: 1.9633 0.4163 sec/batch
...
Epoch 16/20  Iteration 2731/3560 Training loss: 1.8821 0.4043 sec/batch
Epoch 16/20  Iteration 2732/3560 Training loss: 1.8826 0.4132 sec/batch
Epoch 16/20  Iteration 2733/3560 Training loss: 1.8829 0.3822 sec/batch
Epoch 16/20  Iteration 2734/3560 Training loss: 1.8832 0.4725 sec/batch
Epoch 16/20  Iteration 2735/3560 Training loss: 1.8831 0.4122 sec/batch
Epoch 16/20  Iteration 2736/3560 Training loss: 1.8836 0.3918 sec/batch
Epoch 16/20  Iteration 2737/3560 Training loss: 1.8838 0.3916 sec/batch
Epoch 16/20  Iteration 2738/3560 Training loss: 1.8834 0.4310 sec/batch
Epoch 16/20  Iteration 2739/3560 Training loss: 1.8832 0.4452 sec/batch
Epoch 16/20  Iteration 2740/3560 Training loss: 1.8830 0.4216 sec/batch
Epoch 16/20  Iteration 2741/3560 Training loss: 1.8836 0.4161 sec/batch
Epoch 16/20  Iteration 2742/3560 Training loss: 1.8838 0.4131 sec/batch
Epoch 16/20  Iteration 2743/3560 Training loss: 1.8843 0.4283 sec/batch
Epoch 16/20  Iteration 2744/3560 Training loss: 1.8840 0.3979 sec/batch
Epoch 16/20  Iteration 2745/3560 Training loss: 1.8840 0.3790 sec/batch
Epoch 16/20  Iteration 2746/3560 Training loss: 1.8843 0.3821 sec/batch
Epoch 16/20  Iteration 2747/3560 Training loss: 1.8843 0.3901 sec/batch
Epoch 16/20  Iteration 2748/3560 Training loss: 1.8843 0.3867 sec/batch
Epoch 16/20  Iteration 2749/3560 Training loss: 1.8838 0.3869 sec/batch
Epoch 16/20  Iteration 2750/3560 Training loss: 1.8837 0.4187 sec/batch
Epoch 16/20  Iteration 2751/3560 Training loss: 1.8833 0.4469 sec/batch
Epoch 16/20  Iteration 2752/3560 Training loss: 1.8834 0.3752 sec/batch
Epoch 16/20  Iteration 2753/3560 Training loss: 1.8828 0.3976 sec/batch
Epoch 16/20  Iteration 2754/3560 Training loss: 1.8828 0.3905 sec/batch
Epoch 16/20  Iteration 2755/3560 Training loss: 1.8822 0.3921 sec/batch
Epoch 16/20  Iteration 2756/3560 Training loss: 1.8819 0.3857 sec/batch
Epoch 16/20  Iteration 2757/3560 Training loss: 1.8818 0.3864 sec/batch
Epoch 16/20  Iteration 2758/3560 Training loss: 1.8815 0.4114 sec/batch
Epoch 16/20  Iteration 2759/3560 Training loss: 1.8810 0.4266 sec/batch
Epoch 16/20  Iteration 2760/3560 Training loss: 1.8811 0.4320 sec/batch
Epoch 16/20  Iteration 2761/3560 Training loss: 1.8808 0.4647 sec/batch
Epoch 16/20  Iteration 2762/3560 Training loss: 1.8807 0.4575 sec/batch
Epoch 16/20  Iteration 2763/3560 Training loss: 1.8802 0.5046 sec/batch
Epoch 16/20  Iteration 2764/3560 Training loss: 1.8798 0.4539 sec/batch
Epoch 16/20  Iteration 2765/3560 Training loss: 1.8795 0.4314 sec/batch
Epoch 16/20  Iteration 2766/3560 Training loss: 1.8796 0.4521 sec/batch
Epoch 16/20  Iteration 2767/3560 Training loss: 1.8794 0.4559 sec/batch
Epoch 16/20  Iteration 2768/3560 Training loss: 1.8789 0.5228 sec/batch
Epoch 16/20  Iteration 2769/3560 Training loss: 1.8786 0.4006 sec/batch
Epoch 16/20  Iteration 2770/3560 Training loss: 1.8781 0.4141 sec/batch
Epoch 16/20  Iteration 2771/3560 Training loss: 1.8781 0.4274 sec/batch
Epoch 16/20  Iteration 2772/3560 Training loss: 1.8781 0.4205 sec/batch
Epoch 16/20  Iteration 2773/3560 Training loss: 1.8779 0.4604 sec/batch
Epoch 16/20  Iteration 2774/3560 Training loss: 1.8777 0.4364 sec/batch
Epoch 16/20  Iteration 2775/3560 Training loss: 1.8775 0.3982 sec/batch
Epoch 16/20  Iteration 2776/3560 Training loss: 1.8774 0.3969 sec/batch
Epoch 16/20  Iteration 2777/3560 Training loss: 1.8774 0.3999 sec/batch
Epoch 16/20  Iteration 2778/3560 Training loss: 1.8774 0.4052 sec/batch
Epoch 16/20  Iteration 2779/3560 Training loss: 1.8774 0.3891 sec/batch
Epoch 16/20  Iteration 2780/3560 Training loss: 1.8773 0.3835 sec/batch
Epoch 16/20  Iteration 2781/3560 Training loss: 1.8773 0.5198 sec/batch
Epoch 16/20  Iteration 2782/3560 Training loss: 1.8772 0.5281 sec/batch
Epoch 16/20  Iteration 2783/3560 Training loss: 1.8772 0.5051 sec/batch
Epoch 16/20  Iteration 2784/3560 Training loss: 1.8771 0.5430 sec/batch
Epoch 16/20  Iteration 2785/3560 Training loss: 1.8768 0.5228 sec/batch
Epoch 16/20  Iteration 2786/3560 Training loss: 1.8765 0.4055 sec/batch
Epoch 16/20  Iteration 2787/3560 Training loss: 1.8765 0.4055 sec/batch
Epoch 16/20  Iteration 2788/3560 Training loss: 1.8764 0.4261 sec/batch
Epoch 16/20  Iteration 2789/3560 Training loss: 1.8764 0.3910 sec/batch
Epoch 16/20  Iteration 2790/3560 Training loss: 1.8763 0.4226 sec/batch
Epoch 16/20  Iteration 2791/3560 Training loss: 1.8763 0.4235 sec/batch
Epoch 16/20  Iteration 2792/3560 Training loss: 1.8760 0.4009 sec/batch
Epoch 16/20  Iteration 2793/3560 Training loss: 1.8757 0.3844 sec/batch
Epoch 16/20  Iteration 2794/3560 Training loss: 1.8758 0.3888 sec/batch
Epoch 16/20  Iteration 2795/3560 Training loss: 1.8758 0.3859 sec/batch
Epoch 16/20  Iteration 2796/3560 Training loss: 1.8755 0.3910 sec/batch
Epoch 16/20  Iteration 2797/3560 Training loss: 1.8756 0.3920 sec/batch
Epoch 16/20  Iteration 2798/3560 Training loss: 1.8757 0.4055 sec/batch
Epoch 16/20  Iteration 2799/3560 Training loss: 1.8756 0.3955 sec/batch
Epoch 16/20  Iteration 2800/3560 Training loss: 1.8756 0.3908 sec/batch
Epoch 16/20  Iteration 2801/3560 Training loss: 1.8753 0.3944 sec/batch
Epoch 16/20  Iteration 2802/3560 Training loss: 1.8750 0.3908 sec/batch
Epoch 16/20  Iteration 2803/3560 Training loss: 1.8750 0.3821 sec/batch
Epoch 16/20  Iteration 2804/3560 Training loss: 1.8750 0.3925 sec/batch
Epoch 16/20  Iteration 2805/3560 Training loss: 1.8750 0.3866 sec/batch
Epoch 16/20  Iteration 2806/3560 Training loss: 1.8750 0.3922 sec/batch
Epoch 16/20  Iteration 2807/3560 Training loss: 1.8750 0.3936 sec/batch
Epoch 16/20  Iteration 2808/3560 Training loss: 1.8751 0.3864 sec/batch
Epoch 16/20  Iteration 2809/3560 Training loss: 1.8752 0.3874 sec/batch
Epoch 16/20  Iteration 2810/3560 Training loss: 1.8751 0.3800 sec/batch
Epoch 16/20  Iteration 2811/3560 Training loss: 1.8753 0.4072 sec/batch
Epoch 16/20  Iteration 2812/3560 Training loss: 1.8753 0.3950 sec/batch
Epoch 16/20  Iteration 2813/3560 Training loss: 1.8753 0.3884 sec/batch
Epoch 16/20  Iteration 2814/3560 Training loss: 1.8753 0.3802 sec/batch
Epoch 16/20  Iteration 2815/3560 Training loss: 1.8753 0.3884 sec/batch
Epoch 16/20  Iteration 2816/3560 Training loss: 1.8754 0.3855 sec/batch
Epoch 16/20  Iteration 2817/3560 Training loss: 1.8755 0.3850 sec/batch
Epoch 16/20  Iteration 2818/3560 Training loss: 1.8756 0.3796 sec/batch
Epoch 16/20  Iteration 2819/3560 Training loss: 1.8756 0.3859 sec/batch
Epoch 16/20  Iteration 2820/3560 Training loss: 1.8754 0.3833 sec/batch
Epoch 16/20  Iteration 2821/3560 Training loss: 1.8751 0.3898 sec/batch
Epoch 16/20  Iteration 2822/3560 Training loss: 1.8753 0.3846 sec/batch
Epoch 16/20  Iteration 2823/3560 Training loss: 1.8753 0.3851 sec/batch
Epoch 16/20  Iteration 2824/3560 Training loss: 1.8754 0.4071 sec/batch
Epoch 16/20  Iteration 2825/3560 Training loss: 1.8754 0.3825 sec/batch
Epoch 16/20  Iteration 2826/3560 Training loss: 1.8753 0.3799 sec/batch
Epoch 16/20  Iteration 2827/3560 Training loss: 1.8754 0.3889 sec/batch
Epoch 16/20  Iteration 2828/3560 Training loss: 1.8754 0.3966 sec/batch
Epoch 16/20  Iteration 2829/3560 Training loss: 1.8752 0.3865 sec/batch
Epoch 16/20  Iteration 2830/3560 Training loss: 1.8753 0.3874 sec/batch
Epoch 16/20  Iteration 2831/3560 Training loss: 1.8755 0.3899 sec/batch
Epoch 16/20  Iteration 2832/3560 Training loss: 1.8755 0.3878 sec/batch
Epoch 16/20  Iteration 2833/3560 Training loss: 1.8756 0.3976 sec/batch
Epoch 16/20  Iteration 2834/3560 Training loss: 1.8755 0.5232 sec/batch
Epoch 16/20  Iteration 2835/3560 Training loss: 1.8755 0.5142 sec/batch
Epoch 16/20  Iteration 2836/3560 Training loss: 1.8755 0.5245 sec/batch
Epoch 16/20  Iteration 2837/3560 Training loss: 1.8755 0.4412 sec/batch
Epoch 16/20  Iteration 2838/3560 Training loss: 1.8759 0.5335 sec/batch
Epoch 16/20  Iteration 2839/3560 Training loss: 1.8759 0.5891 sec/batch
Epoch 16/20  Iteration 2840/3560 Training loss: 1.8758 0.4074 sec/batch
Epoch 16/20  Iteration 2841/3560 Training loss: 1.8757 0.3874 sec/batch
Epoch 16/20  Iteration 2842/3560 Training loss: 1.8756 0.3851 sec/batch
Epoch 16/20  Iteration 2843/3560 Training loss: 1.8757 0.3818 sec/batch
Epoch 16/20  Iteration 2844/3560 Training loss: 1.8757 0.3991 sec/batch
Epoch 16/20  Iteration 2845/3560 Training loss: 1.8758 0.3877 sec/batch
Epoch 16/20  Iteration 2846/3560 Training loss: 1.8758 0.3813 sec/batch
Epoch 16/20  Iteration 2847/3560 Training loss: 1.8757 0.4084 sec/batch
Epoch 16/20  Iteration 2848/3560 Training loss: 1.8757 0.3884 sec/batch
Epoch 17/20  Iteration 2849/3560 Training loss: 1.9423 0.3808 sec/batch
Epoch 17/20  Iteration 2850/3560 Training loss: 1.9030 0.3845 sec/batch
Epoch 17/20  Iteration 2851/3560 Training loss: 1.8914 0.3822 sec/batch
Epoch 17/20  Iteration 2852/3560 Training loss: 1.8840 0.3845 sec/batch
Epoch 17/20  Iteration 2853/3560 Training loss: 1.8773 0.3937 sec/batch
Epoch 17/20  Iteration 2854/3560 Training loss: 1.8692 0.3917 sec/batch
Epoch 17/20  Iteration 2855/3560 Training loss: 1.8695 0.3904 sec/batch
Epoch 17/20  Iteration 2856/3560 Training loss: 1.8688 0.3850 sec/batch
Epoch 17/20  Iteration 2857/3560 Training loss: 1.8704 0.3849 sec/batch
Epoch 17/20  Iteration 2858/3560 Training loss: 1.8702 0.3853 sec/batch
Epoch 17/20  Iteration 2859/3560 Training loss: 1.8672 0.3867 sec/batch
Epoch 17/20  Iteration 2860/3560 Training loss: 1.8661 0.4313 sec/batch
Epoch 17/20  Iteration 2861/3560 Training loss: 1.8661 0.3836 sec/batch
Epoch 17/20  Iteration 2862/3560 Training loss: 1.8681 0.3874 sec/batch
Epoch 17/20  Iteration 2863/3560 Training loss: 1.8670 0.3899 sec/batch
Epoch 17/20  Iteration 2864/3560 Training loss: 1.8657 0.3825 sec/batch
Epoch 17/20  Iteration 2865/3560 Training loss: 1.8652 0.3823 sec/batch
Epoch 17/20  Iteration 2866/3560 Training loss: 1.8676 0.3816 sec/batch
Epoch 17/20  Iteration 2867/3560 Training loss: 1.8676 0.3844 sec/batch
Epoch 17/20  Iteration 2868/3560 Training loss: 1.8683 0.3871 sec/batch
Epoch 17/20  Iteration 2869/3560 Training loss: 1.8676 0.3896 sec/batch
Epoch 17/20  Iteration 2870/3560 Training loss: 1.8684 0.4001 sec/batch
Epoch 17/20  Iteration 2871/3560 Training loss: 1.8678 0.3818 sec/batch
Epoch 17/20  Iteration 2872/3560 Training loss: 1.8676 0.3825 sec/batch
Epoch 17/20  Iteration 2873/3560 Training loss: 1.8675 0.4121 sec/batch
Epoch 17/20  Iteration 2874/3560 Training loss: 1.8664 0.3824 sec/batch
Epoch 17/20  Iteration 2875/3560 Training loss: 1.8654 0.3873 sec/batch
Epoch 17/20  Iteration 2876/3560 Training loss: 1.8659 0.3830 sec/batch
Epoch 17/20  Iteration 2877/3560 Training loss: 1.8670 0.3901 sec/batch
Epoch 17/20  Iteration 2878/3560 Training loss: 1.8678 0.3860 sec/batch
Epoch 17/20  Iteration 2879/3560 Training loss: 1.8674 0.3848 sec/batch
Epoch 17/20  Iteration 2880/3560 Training loss: 1.8671 0.3932 sec/batch
Epoch 17/20  Iteration 2881/3560 Training loss: 1.8672 0.3869 sec/batch
Epoch 17/20  Iteration 2882/3560 Training loss: 1.8681 0.3841 sec/batch
Epoch 17/20  Iteration 2883/3560 Training loss: 1.8677 0.4004 sec/batch
Epoch 17/20  Iteration 2884/3560 Training loss: 1.8675 0.3821 sec/batch
Epoch 17/20  Iteration 2885/3560 Training loss: 1.8671 0.4112 sec/batch
Epoch 17/20  Iteration 2886/3560 Training loss: 1.8659 0.4218 sec/batch
Epoch 17/20  Iteration 2887/3560 Training loss: 1.8646 0.3878 sec/batch
Epoch 17/20  Iteration 2888/3560 Training loss: 1.8642 0.3868 sec/batch
Epoch 17/20  Iteration 2889/3560 Training loss: 1.8636 0.3798 sec/batch
Epoch 17/20  Iteration 2890/3560 Training loss: 1.8637 0.3848 sec/batch
Epoch 17/20  Iteration 2891/3560 Training loss: 1.8631 0.3857 sec/batch
Epoch 17/20  Iteration 2892/3560 Training loss: 1.8624 0.3889 sec/batch
Epoch 17/20  Iteration 2893/3560 Training loss: 1.8626 0.3904 sec/batch
Epoch 17/20  Iteration 2894/3560 Training loss: 1.8616 0.3870 sec/batch
Epoch 17/20  Iteration 2895/3560 Training loss: 1.8614 0.3825 sec/batch
Epoch 17/20  Iteration 2896/3560 Training loss: 1.8608 0.3857 sec/batch
Epoch 17/20  Iteration 2897/3560 Training loss: 1.8607 0.3891 sec/batch
Epoch 17/20  Iteration 2898/3560 Training loss: 1.8616 0.4035 sec/batch
Epoch 17/20  Iteration 2899/3560 Training loss: 1.8607 0.4152 sec/batch
Epoch 17/20  Iteration 2900/3560 Training loss: 1.8615 0.3815 sec/batch
Epoch 17/20  Iteration 2901/3560 Training loss: 1.8614 0.3845 sec/batch
Epoch 17/20  Iteration 2902/3560 Training loss: 1.8614 0.3786 sec/batch
Epoch 17/20  Iteration 2903/3560 Training loss: 1.8610 0.3808 sec/batch
Epoch 17/20  Iteration 2904/3560 Training loss: 1.8611 0.3934 sec/batch
Epoch 17/20  Iteration 2905/3560 Training loss: 1.8615 0.3835 sec/batch
Epoch 17/20  Iteration 2906/3560 Training loss: 1.8612 0.3820 sec/batch
Epoch 17/20  Iteration 2907/3560 Training loss: 1.8608 0.3900 sec/batch
Epoch 17/20  Iteration 2908/3560 Training loss: 1.8614 0.3849 sec/batch
Epoch 17/20  Iteration 2909/3560 Training loss: 1.8615 0.3909 sec/batch
Epoch 17/20  Iteration 2910/3560 Training loss: 1.8622 0.3848 sec/batch
Epoch 17/20  Iteration 2911/3560 Training loss: 1.8624 0.3877 sec/batch
Epoch 17/20  Iteration 2912/3560 Training loss: 1.8627 0.4112 sec/batch
Epoch 17/20  Iteration 2913/3560 Training loss: 1.8626 0.3855 sec/batch
Epoch 17/20  Iteration 2914/3560 Training loss: 1.8631 0.3862 sec/batch
Epoch 17/20  Iteration 2915/3560 Training loss: 1.8631 0.3797 sec/batch
Epoch 17/20  Iteration 2916/3560 Training loss: 1.8628 0.3855 sec/batch
Epoch 17/20  Iteration 2917/3560 Training loss: 1.8627 0.3842 sec/batch
Epoch 17/20  Iteration 2918/3560 Training loss: 1.8626 0.3843 sec/batch
Epoch 17/20  Iteration 2919/3560 Training loss: 1.8630 0.3816 sec/batch
Epoch 17/20  Iteration 2920/3560 Training loss: 1.8631 0.4002 sec/batch
Epoch 17/20  Iteration 2921/3560 Training loss: 1.8635 0.3876 sec/batch
Epoch 17/20  Iteration 2922/3560 Training loss: 1.8632 0.3803 sec/batch
Epoch 17/20  Iteration 2923/3560 Training loss: 1.8632 0.3836 sec/batch
Epoch 17/20  Iteration 2924/3560 Training loss: 1.8636 0.3839 sec/batch
Epoch 17/20  Iteration 2925/3560 Training loss: 1.8636 0.4056 sec/batch
Epoch 17/20  Iteration 2926/3560 Training loss: 1.8637 0.3832 sec/batch
Epoch 17/20  Iteration 2927/3560 Training loss: 1.8632 0.3943 sec/batch
Epoch 17/20  Iteration 2928/3560 Training loss: 1.8631 0.3832 sec/batch
Epoch 17/20  Iteration 2929/3560 Training loss: 1.8626 0.3857 sec/batch
Epoch 17/20  Iteration 2930/3560 Training loss: 1.8628 0.3843 sec/batch
Epoch 17/20  Iteration 2931/3560 Training loss: 1.8624 0.3869 sec/batch
Epoch 17/20  Iteration 2932/3560 Training loss: 1.8622 0.3812 sec/batch
Epoch 17/20  Iteration 2933/3560 Training loss: 1.8618 0.3854 sec/batch
Epoch 17/20  Iteration 2934/3560 Training loss: 1.8615 0.3945 sec/batch
Epoch 17/20  Iteration 2935/3560 Training loss: 1.8614 0.3869 sec/batch
Epoch 17/20  Iteration 2936/3560 Training loss: 1.8611 0.3833 sec/batch
Epoch 17/20  Iteration 2937/3560 Training loss: 1.8605 0.3827 sec/batch
Epoch 17/20  Iteration 2938/3560 Training loss: 1.8606 0.4056 sec/batch
Epoch 17/20  Iteration 2939/3560 Training loss: 1.8603 0.3939 sec/batch
Epoch 17/20  Iteration 2940/3560 Training loss: 1.8602 0.3874 sec/batch
Epoch 17/20  Iteration 2941/3560 Training loss: 1.8597 0.3857 sec/batch
Epoch 17/20  Iteration 2942/3560 Training loss: 1.8594 0.3877 sec/batch
Epoch 17/20  Iteration 2943/3560 Training loss: 1.8592 0.3882 sec/batch
Epoch 17/20  Iteration 2944/3560 Training loss: 1.8592 0.3807 sec/batch
Epoch 17/20  Iteration 2945/3560 Training loss: 1.8591 0.3805 sec/batch
Epoch 17/20  Iteration 2946/3560 Training loss: 1.8588 0.3838 sec/batch
Epoch 17/20  Iteration 2947/3560 Training loss: 1.8584 0.3805 sec/batch
Epoch 17/20  Iteration 2948/3560 Training loss: 1.8580 0.3858 sec/batch
Epoch 17/20  Iteration 2949/3560 Training loss: 1.8579 0.3839 sec/batch
Epoch 17/20  Iteration 2950/3560 Training loss: 1.8578 0.3867 sec/batch
Epoch 17/20  Iteration 2951/3560 Training loss: 1.8576 0.4406 sec/batch
Epoch 17/20  Iteration 2952/3560 Training loss: 1.8574 0.3847 sec/batch
Epoch 17/20  Iteration 2953/3560 Training loss: 1.8573 0.3905 sec/batch
Epoch 17/20  Iteration 2954/3560 Training loss: 1.8572 0.3837 sec/batch
Epoch 17/20  Iteration 2955/3560 Training loss: 1.8570 0.3796 sec/batch
Epoch 17/20  Iteration 2956/3560 Training loss: 1.8570 0.3913 sec/batch
Epoch 17/20  Iteration 2957/3560 Training loss: 1.8571 0.3861 sec/batch
Epoch 17/20  Iteration 2958/3560 Training loss: 1.8572 0.3841 sec/batch
Epoch 17/20  Iteration 2959/3560 Training loss: 1.8571 0.3838 sec/batch
Epoch 17/20  Iteration 2960/3560 Training loss: 1.8570 0.3916 sec/batch
Epoch 17/20  Iteration 2961/3560 Training loss: 1.8570 0.3887 sec/batch
Epoch 17/20  Iteration 2962/3560 Training loss: 1.8569 0.3846 sec/batch
Epoch 17/20  Iteration 2963/3560 Training loss: 1.8566 0.3871 sec/batch
Epoch 17/20  Iteration 2964/3560 Training loss: 1.8563 0.4083 sec/batch
Epoch 17/20  Iteration 2965/3560 Training loss: 1.8563 0.3904 sec/batch
Epoch 17/20  Iteration 2966/3560 Training loss: 1.8563 0.3858 sec/batch
Epoch 17/20  Iteration 2967/3560 Training loss: 1.8562 0.3803 sec/batch
Epoch 17/20  Iteration 2968/3560 Training loss: 1.8561 0.3816 sec/batch
Epoch 17/20  Iteration 2969/3560 Training loss: 1.8562 0.3858 sec/batch
Epoch 17/20  Iteration 2970/3560 Training loss: 1.8558 0.3928 sec/batch
Epoch 17/20  Iteration 2971/3560 Training loss: 1.8555 0.4031 sec/batch
Epoch 17/20  Iteration 2972/3560 Training loss: 1.8557 0.3883 sec/batch
Epoch 17/20  Iteration 2973/3560 Training loss: 1.8557 0.3918 sec/batch
Epoch 17/20  Iteration 2974/3560 Training loss: 1.8554 0.3817 sec/batch
Epoch 17/20  Iteration 2975/3560 Training loss: 1.8555 0.3853 sec/batch
Epoch 17/20  Iteration 2976/3560 Training loss: 1.8556 0.3861 sec/batch
Epoch 17/20  Iteration 2977/3560 Training loss: 1.8555 0.4095 sec/batch
Epoch 17/20  Iteration 2978/3560 Training loss: 1.8555 0.3819 sec/batch
Epoch 17/20  Iteration 2979/3560 Training loss: 1.8553 0.3801 sec/batch
Epoch 17/20  Iteration 2980/3560 Training loss: 1.8549 0.3827 sec/batch
Epoch 17/20  Iteration 2981/3560 Training loss: 1.8549 0.3873 sec/batch
Epoch 17/20  Iteration 2982/3560 Training loss: 1.8550 0.3882 sec/batch
Epoch 17/20  Iteration 2983/3560 Training loss: 1.8550 0.3839 sec/batch
Epoch 17/20  Iteration 2984/3560 Training loss: 1.8550 0.3838 sec/batch
Epoch 17/20  Iteration 2985/3560 Training loss: 1.8551 0.3940 sec/batch
Epoch 17/20  Iteration 2986/3560 Training loss: 1.8551 0.3885 sec/batch
Epoch 17/20  Iteration 2987/3560 Training loss: 1.8553 0.3921 sec/batch
Epoch 17/20  Iteration 2988/3560 Training loss: 1.8551 0.3851 sec/batch
Epoch 17/20  Iteration 2989/3560 Training loss: 1.8553 0.3830 sec/batch
Epoch 17/20  Iteration 2990/3560 Training loss: 1.8553 0.4130 sec/batch
Epoch 17/20  Iteration 2991/3560 Training loss: 1.8553 0.3858 sec/batch
Epoch 17/20  Iteration 2992/3560 Training loss: 1.8553 0.3884 sec/batch
Epoch 17/20  Iteration 2993/3560 Training loss: 1.8552 0.3838 sec/batch
Epoch 17/20  Iteration 2994/3560 Training loss: 1.8553 0.3824 sec/batch
Epoch 17/20  Iteration 2995/3560 Training loss: 1.8553 0.3882 sec/batch
Epoch 17/20  Iteration 2996/3560 Training loss: 1.8555 0.4131 sec/batch
Epoch 17/20  Iteration 2997/3560 Training loss: 1.8555 0.3849 sec/batch
Epoch 17/20  Iteration 2998/3560 Training loss: 1.8553 0.3818 sec/batch
Epoch 17/20  Iteration 2999/3560 Training loss: 1.8551 0.3858 sec/batch
Epoch 17/20  Iteration 3000/3560 Training loss: 1.8553 0.4010 sec/batch
Epoch 17/20  Iteration 3001/3560 Training loss: 1.8553 0.3866 sec/batch
Epoch 17/20  Iteration 3002/3560 Training loss: 1.8554 0.3810 sec/batch
Epoch 17/20  Iteration 3003/3560 Training loss: 1.8553 0.4091 sec/batch
Epoch 17/20  Iteration 3004/3560 Training loss: 1.8554 0.3893 sec/batch
Epoch 17/20  Iteration 3005/3560 Training loss: 1.8555 0.3904 sec/batch
Epoch 17/20  Iteration 3006/3560 Training loss: 1.8554 0.3815 sec/batch
Epoch 17/20  Iteration 3007/3560 Training loss: 1.8552 0.3996 sec/batch
Epoch 17/20  Iteration 3008/3560 Training loss: 1.8554 0.3828 sec/batch
Epoch 17/20  Iteration 3009/3560 Training loss: 1.8556 0.3868 sec/batch
Epoch 17/20  Iteration 3010/3560 Training loss: 1.8555 0.3902 sec/batch
Epoch 17/20  Iteration 3011/3560 Training loss: 1.8556 0.3854 sec/batch
Epoch 17/20  Iteration 3012/3560 Training loss: 1.8556 0.3860 sec/batch
Epoch 17/20  Iteration 3013/3560 Training loss: 1.8556 0.3854 sec/batch
Epoch 17/20  Iteration 3014/3560 Training loss: 1.8555 0.3860 sec/batch
Epoch 17/20  Iteration 3015/3560 Training loss: 1.8556 0.3877 sec/batch
Epoch 17/20  Iteration 3016/3560 Training loss: 1.8559 0.4104 sec/batch
Epoch 17/20  Iteration 3017/3560 Training loss: 1.8559 0.3950 sec/batch
Epoch 17/20  Iteration 3018/3560 Training loss: 1.8559 0.3978 sec/batch
Epoch 17/20  Iteration 3019/3560 Training loss: 1.8558 0.3876 sec/batch
Epoch 17/20  Iteration 3020/3560 Training loss: 1.8557 0.3862 sec/batch
Epoch 17/20  Iteration 3021/3560 Training loss: 1.8559 0.3809 sec/batch
Epoch 17/20  Iteration 3022/3560 Training loss: 1.8559 0.3816 sec/batch
Epoch 17/20  Iteration 3023/3560 Training loss: 1.8560 0.3912 sec/batch
Epoch 17/20  Iteration 3024/3560 Training loss: 1.8559 0.3828 sec/batch
Epoch 17/20  Iteration 3025/3560 Training loss: 1.8559 0.3829 sec/batch
Epoch 17/20  Iteration 3026/3560 Training loss: 1.8559 0.3955 sec/batch
Epoch 18/20  Iteration 3027/3560 Training loss: 1.9217 0.3849 sec/batch
Epoch 18/20  Iteration 3028/3560 Training loss: 1.8832 0.3865 sec/batch
Epoch 18/20  Iteration 3029/3560 Training loss: 1.8733 0.4081 sec/batch
Epoch 18/20  Iteration 3030/3560 Training loss: 1.8679 0.3887 sec/batch
Epoch 18/20  Iteration 3031/3560 Training loss: 1.8634 0.4684 sec/batch
Epoch 18/20  Iteration 3032/3560 Training loss: 1.8560 0.3830 sec/batch
Epoch 18/20  Iteration 3033/3560 Training loss: 1.8548 0.3852 sec/batch
Epoch 18/20  Iteration 3034/3560 Training loss: 1.8541 0.3809 sec/batch
Epoch 18/20  Iteration 3035/3560 Training loss: 1.8555 0.3845 sec/batch
Epoch 18/20  Iteration 3036/3560 Training loss: 1.8537 0.4177 sec/batch
Epoch 18/20  Iteration 3037/3560 Training loss: 1.8497 0.4034 sec/batch
Epoch 18/20  Iteration 3038/3560 Training loss: 1.8487 0.3890 sec/batch
Epoch 18/20  Iteration 3039/3560 Training loss: 1.8480 0.3817 sec/batch
Epoch 18/20  Iteration 3040/3560 Training loss: 1.8505 0.3805 sec/batch
Epoch 18/20  Iteration 3041/3560 Training loss: 1.8496 0.3855 sec/batch
Epoch 18/20  Iteration 3042/3560 Training loss: 1.8481 0.4037 sec/batch
Epoch 18/20  Iteration 3043/3560 Training loss: 1.8484 0.3838 sec/batch
Epoch 18/20  Iteration 3044/3560 Training loss: 1.8510 0.3857 sec/batch
Epoch 18/20  Iteration 3045/3560 Training loss: 1.8509 0.3862 sec/batch
Epoch 18/20  Iteration 3046/3560 Training loss: 1.8512 0.3957 sec/batch
Epoch 18/20  Iteration 3047/3560 Training loss: 1.8500 0.3831 sec/batch
Epoch 18/20  Iteration 3048/3560 Training loss: 1.8508 0.3880 sec/batch
Epoch 18/20  Iteration 3049/3560 Training loss: 1.8506 0.3843 sec/batch
Epoch 18/20  Iteration 3050/3560 Training loss: 1.8496 0.3893 sec/batch
Epoch 18/20  Iteration 3051/3560 Training loss: 1.8491 0.3879 sec/batch
Epoch 18/20  Iteration 3052/3560 Training loss: 1.8476 0.3844 sec/batch
Epoch 18/20  Iteration 3053/3560 Training loss: 1.8468 0.3850 sec/batch
Epoch 18/20  Iteration 3054/3560 Training loss: 1.8475 0.3833 sec/batch
Epoch 18/20  Iteration 3055/3560 Training loss: 1.8485 0.4054 sec/batch
Epoch 18/20  Iteration 3056/3560 Training loss: 1.8490 0.3804 sec/batch
Epoch 18/20  Iteration 3057/3560 Training loss: 1.8490 0.3910 sec/batch
Epoch 18/20  Iteration 3058/3560 Training loss: 1.8483 0.3852 sec/batch
Epoch 18/20  Iteration 3059/3560 Training loss: 1.8488 0.3820 sec/batch
Epoch 18/20  Iteration 3060/3560 Training loss: 1.8496 0.3837 sec/batch
Epoch 18/20  Iteration 3061/3560 Training loss: 1.8495 0.3915 sec/batch
Epoch 18/20  Iteration 3062/3560 Training loss: 1.8494 0.3875 sec/batch
Epoch 18/20  Iteration 3063/3560 Training loss: 1.8491 0.3837 sec/batch
Epoch 18/20  Iteration 3064/3560 Training loss: 1.8481 0.3809 sec/batch
Epoch 18/20  Iteration 3065/3560 Training loss: 1.8470 0.3890 sec/batch
Epoch 18/20  Iteration 3066/3560 Training loss: 1.8465 0.3807 sec/batch
Epoch 18/20  Iteration 3067/3560 Training loss: 1.8462 0.3783 sec/batch
Epoch 18/20  Iteration 3068/3560 Training loss: 1.8465 0.4051 sec/batch
Epoch 18/20  Iteration 3069/3560 Training loss: 1.8460 0.3869 sec/batch
Epoch 18/20  Iteration 3070/3560 Training loss: 1.8454 0.3855 sec/batch
Epoch 18/20  Iteration 3071/3560 Training loss: 1.8456 0.3854 sec/batch
Epoch 18/20  Iteration 3072/3560 Training loss: 1.8446 0.4006 sec/batch
Epoch 18/20  Iteration 3073/3560 Training loss: 1.8443 0.3877 sec/batch
Epoch 18/20  Iteration 3074/3560 Training loss: 1.8437 0.3895 sec/batch
Epoch 18/20  Iteration 3075/3560 Training loss: 1.8436 0.3871 sec/batch
Epoch 18/20  Iteration 3076/3560 Training loss: 1.8444 0.3844 sec/batch
Epoch 18/20  Iteration 3077/3560 Training loss: 1.8441 0.3958 sec/batch
Epoch 18/20  Iteration 3078/3560 Training loss: 1.8450 0.3848 sec/batch
Epoch 18/20  Iteration 3079/3560 Training loss: 1.8449 0.3799 sec/batch
Epoch 18/20  Iteration 3080/3560 Training loss: 1.8449 0.3887 sec/batch
Epoch 18/20  Iteration 3081/3560 Training loss: 1.8446 0.4111 sec/batch
Epoch 18/20  Iteration 3082/3560 Training loss: 1.8449 0.3813 sec/batch
Epoch 18/20  Iteration 3083/3560 Training loss: 1.8452 0.3843 sec/batch
Epoch 18/20  Iteration 3084/3560 Training loss: 1.8448 0.3841 sec/batch
Epoch 18/20  Iteration 3085/3560 Training loss: 1.8443 0.3834 sec/batch
Epoch 18/20  Iteration 3086/3560 Training loss: 1.8449 0.3872 sec/batch
Epoch 18/20  Iteration 3087/3560 Training loss: 1.8448 0.3846 sec/batch
Epoch 18/20  Iteration 3088/3560 Training loss: 1.8454 0.3887 sec/batch
Epoch 18/20  Iteration 3089/3560 Training loss: 1.8458 0.3858 sec/batch
Epoch 18/20  Iteration 3090/3560 Training loss: 1.8461 0.3841 sec/batch
Epoch 18/20  Iteration 3091/3560 Training loss: 1.8461 0.3832 sec/batch
Epoch 18/20  Iteration 3092/3560 Training loss: 1.8464 0.3941 sec/batch
Epoch 18/20  Iteration 3093/3560 Training loss: 1.8466 0.3866 sec/batch
Epoch 18/20  Iteration 3094/3560 Training loss: 1.8462 0.4093 sec/batch
Epoch 18/20  Iteration 3095/3560 Training loss: 1.8461 0.3911 sec/batch
Epoch 18/20  Iteration 3096/3560 Training loss: 1.8459 0.3860 sec/batch
Epoch 18/20  Iteration 3097/3560 Training loss: 1.8463 0.3843 sec/batch
Epoch 18/20  Iteration 3098/3560 Training loss: 1.8465 0.3888 sec/batch
Epoch 18/20  Iteration 3099/3560 Training loss: 1.8470 0.3812 sec/batch
Epoch 18/20  Iteration 3100/3560 Training loss: 1.8466 0.3842 sec/batch
Epoch 18/20  Iteration 3101/3560 Training loss: 1.8466 0.3847 sec/batch
Epoch 18/20  Iteration 3102/3560 Training loss: 1.8469 0.3852 sec/batch
Epoch 18/20  Iteration 3103/3560 Training loss: 1.8469 0.3778 sec/batch
Epoch 18/20  Iteration 3104/3560 Training loss: 1.8470 0.3944 sec/batch
Epoch 18/20  Iteration 3105/3560 Training loss: 1.8464 0.3854 sec/batch
Epoch 18/20  Iteration 3106/3560 Training loss: 1.8463 0.3853 sec/batch
Epoch 18/20  Iteration 3107/3560 Training loss: 1.8459 0.4126 sec/batch
Epoch 18/20  Iteration 3108/3560 Training loss: 1.8460 0.3831 sec/batch
Epoch 18/20  Iteration 3109/3560 Training loss: 1.8454 0.3929 sec/batch
Epoch 18/20  Iteration 3110/3560 Training loss: 1.8452 0.3977 sec/batch
Epoch 18/20  Iteration 3111/3560 Training loss: 1.8448 0.3868 sec/batch
Epoch 18/20  Iteration 3112/3560 Training loss: 1.8444 0.3870 sec/batch
Epoch 18/20  Iteration 3113/3560 Training loss: 1.8442 0.3896 sec/batch
Epoch 18/20  Iteration 3114/3560 Training loss: 1.8439 0.3876 sec/batch
Epoch 18/20  Iteration 3115/3560 Training loss: 1.8434 0.3830 sec/batch
Epoch 18/20  Iteration 3116/3560 Training loss: 1.8435 0.3900 sec/batch
Epoch 18/20  Iteration 3117/3560 Training loss: 1.8432 0.3846 sec/batch
Epoch 18/20  Iteration 3118/3560 Training loss: 1.8430 0.3843 sec/batch
Epoch 18/20  Iteration 3119/3560 Training loss: 1.8426 0.3904 sec/batch
Epoch 18/20  Iteration 3120/3560 Training loss: 1.8423 0.4191 sec/batch
Epoch 18/20  Iteration 3121/3560 Training loss: 1.8420 0.3847 sec/batch
Epoch 18/20  Iteration 3122/3560 Training loss: 1.8420 0.3871 sec/batch
Epoch 18/20  Iteration 3123/3560 Training loss: 1.8420 0.3849 sec/batch
Epoch 18/20  Iteration 3124/3560 Training loss: 1.8416 0.3822 sec/batch
Epoch 18/20  Iteration 3125/3560 Training loss: 1.8412 0.3842 sec/batch
Epoch 18/20  Iteration 3126/3560 Training loss: 1.8409 0.3795 sec/batch
Epoch 18/20  Iteration 3127/3560 Training loss: 1.8407 0.3926 sec/batch
Epoch 18/20  Iteration 3128/3560 Training loss: 1.8406 0.3847 sec/batch
Epoch 18/20  Iteration 3129/3560 Training loss: 1.8403 0.3857 sec/batch
Epoch 18/20  Iteration 3130/3560 Training loss: 1.8402 0.3935 sec/batch
Epoch 18/20  Iteration 3131/3560 Training loss: 1.8399 0.3866 sec/batch
Epoch 18/20  Iteration 3132/3560 Training loss: 1.8398 0.3921 sec/batch
Epoch 18/20  Iteration 3133/3560 Training loss: 1.8397 0.4172 sec/batch
Epoch 18/20  Iteration 3134/3560 Training loss: 1.8396 0.3813 sec/batch
Epoch 18/20  Iteration 3135/3560 Training loss: 1.8397 0.3955 sec/batch
Epoch 18/20  Iteration 3136/3560 Training loss: 1.8398 0.3883 sec/batch
Epoch 18/20  Iteration 3137/3560 Training loss: 1.8398 0.3877 sec/batch
Epoch 18/20  Iteration 3138/3560 Training loss: 1.8397 0.3865 sec/batch
Epoch 18/20  Iteration 3139/3560 Training loss: 1.8398 0.3988 sec/batch
Epoch 18/20  Iteration 3140/3560 Training loss: 1.8397 0.3804 sec/batch
Epoch 18/20  Iteration 3141/3560 Training loss: 1.8395 0.3924 sec/batch
Epoch 18/20  Iteration 3142/3560 Training loss: 1.8392 0.3913 sec/batch
Epoch 18/20  Iteration 3143/3560 Training loss: 1.8392 0.3861 sec/batch
Epoch 18/20  Iteration 3144/3560 Training loss: 1.8391 0.3876 sec/batch
Epoch 18/20  Iteration 3145/3560 Training loss: 1.8391 0.3862 sec/batch
Epoch 18/20  Iteration 3146/3560 Training loss: 1.8390 0.4073 sec/batch
Epoch 18/20  Iteration 3147/3560 Training loss: 1.8390 0.4000 sec/batch
Epoch 18/20  Iteration 3148/3560 Training loss: 1.8387 0.3951 sec/batch
Epoch 18/20  Iteration 3149/3560 Training loss: 1.8385 0.3888 sec/batch
Epoch 18/20  Iteration 3150/3560 Training loss: 1.8386 0.3845 sec/batch
Epoch 18/20  Iteration 3151/3560 Training loss: 1.8385 0.3905 sec/batch
Epoch 18/20  Iteration 3152/3560 Training loss: 1.8381 0.3795 sec/batch
Epoch 18/20  Iteration 3153/3560 Training loss: 1.8382 0.3769 sec/batch
Epoch 18/20  Iteration 3154/3560 Training loss: 1.8383 0.3884 sec/batch
Epoch 18/20  Iteration 3155/3560 Training loss: 1.8382 0.3862 sec/batch
Epoch 18/20  Iteration 3156/3560 Training loss: 1.8382 0.3819 sec/batch
Epoch 18/20  Iteration 3157/3560 Training loss: 1.8379 0.3832 sec/batch
Epoch 18/20  Iteration 3158/3560 Training loss: 1.8377 0.3848 sec/batch
Epoch 18/20  Iteration 3159/3560 Training loss: 1.8377 0.4071 sec/batch
Epoch 18/20  Iteration 3160/3560 Training loss: 1.8377 0.3817 sec/batch
Epoch 18/20  Iteration 3161/3560 Training loss: 1.8378 0.3813 sec/batch
Epoch 18/20  Iteration 3162/3560 Training loss: 1.8378 0.3827 sec/batch
Epoch 18/20  Iteration 3163/3560 Training loss: 1.8379 0.3870 sec/batch
Epoch 18/20  Iteration 3164/3560 Training loss: 1.8380 0.3926 sec/batch
Epoch 18/20  Iteration 3165/3560 Training loss: 1.8382 0.3906 sec/batch
Epoch 18/20  Iteration 3166/3560 Training loss: 1.8380 0.3864 sec/batch
Epoch 18/20  Iteration 3167/3560 Training loss: 1.8383 0.3805 sec/batch
Epoch 18/20  Iteration 3168/3560 Training loss: 1.8383 0.3835 sec/batch
Epoch 18/20  Iteration 3169/3560 Training loss: 1.8382 0.3825 sec/batch
Epoch 18/20  Iteration 3170/3560 Training loss: 1.8383 0.3863 sec/batch
Epoch 18/20  Iteration 3171/3560 Training loss: 1.8383 0.3795 sec/batch
Epoch 18/20  Iteration 3172/3560 Training loss: 1.8384 0.4077 sec/batch
Epoch 18/20  Iteration 3173/3560 Training loss: 1.8385 0.3872 sec/batch
Epoch 18/20  Iteration 3174/3560 Training loss: 1.8386 0.3923 sec/batch
Epoch 18/20  Iteration 3175/3560 Training loss: 1.8386 0.3836 sec/batch
Epoch 18/20  Iteration 3176/3560 Training loss: 1.8385 0.3840 sec/batch
Epoch 18/20  Iteration 3177/3560 Training loss: 1.8382 0.3858 sec/batch
Epoch 18/20  Iteration 3178/3560 Training loss: 1.8382 0.3907 sec/batch
Epoch 18/20  Iteration 3179/3560 Training loss: 1.8383 0.3806 sec/batch
Epoch 18/20  Iteration 3180/3560 Training loss: 1.8384 0.3871 sec/batch
Epoch 18/20  Iteration 3181/3560 Training loss: 1.8384 0.3881 sec/batch
Epoch 18/20  Iteration 3182/3560 Training loss: 1.8383 0.3874 sec/batch
Epoch 18/20  Iteration 3183/3560 Training loss: 1.8384 0.3861 sec/batch
Epoch 18/20  Iteration 3184/3560 Training loss: 1.8385 0.3876 sec/batch
Epoch 18/20  Iteration 3185/3560 Training loss: 1.8383 0.4072 sec/batch
Epoch 18/20  Iteration 3186/3560 Training loss: 1.8385 0.3833 sec/batch
Epoch 18/20  Iteration 3187/3560 Training loss: 1.8386 0.3832 sec/batch
Epoch 18/20  Iteration 3188/3560 Training loss: 1.8387 0.3869 sec/batch
Epoch 18/20  Iteration 3189/3560 Training loss: 1.8387 0.3851 sec/batch
Epoch 18/20  Iteration 3190/3560 Training loss: 1.8387 0.3911 sec/batch
Epoch 18/20  Iteration 3191/3560 Training loss: 1.8388 0.3992 sec/batch
Epoch 18/20  Iteration 3192/3560 Training loss: 1.8387 0.3794 sec/batch
Epoch 18/20  Iteration 3193/3560 Training loss: 1.8388 0.3871 sec/batch
Epoch 18/20  Iteration 3194/3560 Training loss: 1.8392 0.3822 sec/batch
Epoch 18/20  Iteration 3195/3560 Training loss: 1.8392 0.3860 sec/batch
Epoch 18/20  Iteration 3196/3560 Training loss: 1.8392 0.3878 sec/batch
Epoch 18/20  Iteration 3197/3560 Training loss: 1.8391 0.3878 sec/batch
Epoch 18/20  Iteration 3198/3560 Training loss: 1.8391 0.4090 sec/batch
Epoch 18/20  Iteration 3199/3560 Training loss: 1.8392 0.3835 sec/batch
Epoch 18/20  Iteration 3200/3560 Training loss: 1.8392 0.3842 sec/batch
Epoch 18/20  Iteration 3201/3560 Training loss: 1.8392 0.3881 sec/batch
Epoch 18/20  Iteration 3202/3560 Training loss: 1.8392 0.3891 sec/batch
Epoch 18/20  Iteration 3203/3560 Training loss: 1.8391 0.3935 sec/batch
Epoch 18/20  Iteration 3204/3560 Training loss: 1.8392 0.3823 sec/batch
Epoch 19/20  Iteration 3205/3560 Training loss: 1.9016 0.3886 sec/batch
Epoch 19/20  Iteration 3206/3560 Training loss: 1.8613 0.3878 sec/batch
Epoch 19/20  Iteration 3207/3560 Training loss: 1.8539 0.3807 sec/batch
Epoch 19/20  Iteration 3208/3560 Training loss: 1.8490 0.3865 sec/batch
Epoch 19/20  Iteration 3209/3560 Training loss: 1.8438 0.3834 sec/batch
Epoch 19/20  Iteration 3210/3560 Training loss: 1.8349 0.3868 sec/batch
Epoch 19/20  Iteration 3211/3560 Training loss: 1.8367 0.4058 sec/batch
Epoch 19/20  Iteration 3212/3560 Training loss: 1.8352 0.3818 sec/batch
Epoch 19/20  Iteration 3213/3560 Training loss: 1.8370 0.3928 sec/batch
Epoch 19/20  Iteration 3214/3560 Training loss: 1.8356 0.3830 sec/batch
Epoch 19/20  Iteration 3215/3560 Training loss: 1.8326 0.3824 sec/batch
Epoch 19/20  Iteration 3216/3560 Training loss: 1.8306 0.3849 sec/batch
Epoch 19/20  Iteration 3217/3560 Training loss: 1.8315 0.3856 sec/batch
Epoch 19/20  Iteration 3218/3560 Training loss: 1.8342 0.3879 sec/batch
Epoch 19/20  Iteration 3219/3560 Training loss: 1.8338 0.3847 sec/batch
Epoch 19/20  Iteration 3220/3560 Training loss: 1.8321 0.3957 sec/batch
Epoch 19/20  Iteration 3221/3560 Training loss: 1.8321 0.3909 sec/batch
Epoch 19/20  Iteration 3222/3560 Training loss: 1.8349 0.3898 sec/batch
Epoch 19/20  Iteration 3223/3560 Training loss: 1.8352 0.3846 sec/batch
Epoch 19/20  Iteration 3224/3560 Training loss: 1.8359 0.4136 sec/batch
Epoch 19/20  Iteration 3225/3560 Training loss: 1.8351 0.3993 sec/batch
Epoch 19/20  Iteration 3226/3560 Training loss: 1.8357 0.3857 sec/batch
Epoch 19/20  Iteration 3227/3560 Training loss: 1.8352 0.3922 sec/batch
Epoch 19/20  Iteration 3228/3560 Training loss: 1.8351 0.3846 sec/batch
Epoch 19/20  Iteration 3229/3560 Training loss: 1.8345 0.3857 sec/batch
Epoch 19/20  Iteration 3230/3560 Training loss: 1.8334 0.3796 sec/batch
Epoch 19/20  Iteration 3231/3560 Training loss: 1.8322 0.3834 sec/batch
Epoch 19/20  Iteration 3232/3560 Training loss: 1.8328 0.3786 sec/batch
Epoch 19/20  Iteration 3233/3560 Training loss: 1.8334 0.3856 sec/batch
Epoch 19/20  Iteration 3234/3560 Training loss: 1.8338 0.3949 sec/batch
Epoch 19/20  Iteration 3235/3560 Training loss: 1.8336 0.3826 sec/batch
Epoch 19/20  Iteration 3236/3560 Training loss: 1.8328 0.3848 sec/batch
Epoch 19/20  Iteration 3237/3560 Training loss: 1.8332 0.4090 sec/batch
Epoch 19/20  Iteration 3238/3560 Training loss: 1.8337 0.3809 sec/batch
Epoch 19/20  Iteration 3239/3560 Training loss: 1.8332 0.3849 sec/batch
Epoch 19/20  Iteration 3240/3560 Training loss: 1.8332 0.3847 sec/batch
Epoch 19/20  Iteration 3241/3560 Training loss: 1.8328 0.3942 sec/batch
Epoch 19/20  Iteration 3242/3560 Training loss: 1.8319 0.3894 sec/batch
Epoch 19/20  Iteration 3243/3560 Training loss: 1.8308 0.3865 sec/batch
Epoch 19/20  Iteration 3244/3560 Training loss: 1.8302 0.3881 sec/batch
Epoch 19/20  Iteration 3245/3560 Training loss: 1.8298 0.3863 sec/batch
Epoch 19/20  Iteration 3246/3560 Training loss: 1.8303 0.3822 sec/batch
Epoch 19/20  Iteration 3247/3560 Training loss: 1.8298 0.3905 sec/batch
Epoch 19/20  Iteration 3248/3560 Training loss: 1.8290 0.3834 sec/batch
Epoch 19/20  Iteration 3249/3560 Training loss: 1.8290 0.3893 sec/batch
Epoch 19/20  Iteration 3250/3560 Training loss: 1.8279 0.4097 sec/batch
Epoch 19/20  Iteration 3251/3560 Training loss: 1.8274 0.3887 sec/batch
Epoch 19/20  Iteration 3252/3560 Training loss: 1.8268 0.3868 sec/batch
Epoch 19/20  Iteration 3253/3560 Training loss: 1.8269 0.3847 sec/batch
Epoch 19/20  Iteration 3254/3560 Training loss: 1.8278 0.3902 sec/batch
Epoch 19/20  Iteration 3255/3560 Training loss: 1.8273 0.3866 sec/batch
Epoch 19/20  Iteration 3256/3560 Training loss: 1.8281 0.3821 sec/batch
Epoch 19/20  Iteration 3257/3560 Training loss: 1.8280 0.3897 sec/batch
Epoch 19/20  Iteration 3258/3560 Training loss: 1.8278 0.3798 sec/batch
Epoch 19/20  Iteration 3259/3560 Training loss: 1.8275 0.3779 sec/batch
Epoch 19/20  Iteration 3260/3560 Training loss: 1.8279 0.3876 sec/batch
Epoch 19/20  Iteration 3261/3560 Training loss: 1.8283 0.3821 sec/batch
Epoch 19/20  Iteration 3262/3560 Training loss: 1.8280 0.4086 sec/batch
Epoch 19/20  Iteration 3263/3560 Training loss: 1.8276 0.4048 sec/batch
Epoch 19/20  Iteration 3264/3560 Training loss: 1.8281 0.3889 sec/batch
Epoch 19/20  Iteration 3265/3560 Training loss: 1.8281 0.3751 sec/batch
Epoch 19/20  Iteration 3266/3560 Training loss: 1.8289 0.3820 sec/batch
Epoch 19/20  Iteration 3267/3560 Training loss: 1.8292 0.3839 sec/batch
Epoch 19/20  Iteration 3268/3560 Training loss: 1.8294 0.3828 sec/batch
Epoch 19/20  Iteration 3269/3560 Training loss: 1.8293 0.3838 sec/batch
Epoch 19/20  Iteration 3270/3560 Training loss: 1.8298 0.3799 sec/batch
Epoch 19/20  Iteration 3271/3560 Training loss: 1.8301 0.3842 sec/batch
Epoch 19/20  Iteration 3272/3560 Training loss: 1.8296 0.3837 sec/batch
Epoch 19/20  Iteration 3273/3560 Training loss: 1.8296 0.3839 sec/batch
Epoch 19/20  Iteration 3274/3560 Training loss: 1.8294 0.3848 sec/batch
Epoch 19/20  Iteration 3275/3560 Training loss: 1.8299 0.3830 sec/batch
Epoch 19/20  Iteration 3276/3560 Training loss: 1.8302 0.4057 sec/batch
Epoch 19/20  Iteration 3277/3560 Training loss: 1.8305 0.3855 sec/batch
Epoch 19/20  Iteration 3278/3560 Training loss: 1.8301 0.3884 sec/batch
Epoch 19/20  Iteration 3279/3560 Training loss: 1.8300 0.3838 sec/batch
Epoch 19/20  Iteration 3280/3560 Training loss: 1.8304 0.3823 sec/batch
Epoch 19/20  Iteration 3281/3560 Training loss: 1.8302 0.3833 sec/batch
Epoch 19/20  Iteration 3282/3560 Training loss: 1.8304 0.3869 sec/batch
Epoch 19/20  Iteration 3283/3560 Training loss: 1.8298 0.3856 sec/batch
Epoch 19/20  Iteration 3284/3560 Training loss: 1.8296 0.3828 sec/batch
Epoch 19/20  Iteration 3285/3560 Training loss: 1.8291 0.3856 sec/batch
Epoch 19/20  Iteration 3286/3560 Training loss: 1.8292 0.3905 sec/batch
Epoch 19/20  Iteration 3287/3560 Training loss: 1.8287 0.3827 sec/batch
Epoch 19/20  Iteration 3288/3560 Training loss: 1.8287 0.3799 sec/batch
Epoch 19/20  Iteration 3289/3560 Training loss: 1.8284 0.4092 sec/batch
Epoch 19/20  Iteration 3290/3560 Training loss: 1.8283 0.3916 sec/batch
Epoch 19/20  Iteration 3291/3560 Training loss: 1.8282 0.3939 sec/batch
Epoch 19/20  Iteration 3292/3560 Training loss: 1.8279 0.3875 sec/batch
Epoch 19/20  Iteration 3293/3560 Training loss: 1.8274 0.3948 sec/batch
Epoch 19/20  Iteration 3294/3560 Training loss: 1.8275 0.3863 sec/batch
Epoch 19/20  Iteration 3295/3560 Training loss: 1.8272 0.3872 sec/batch
Epoch 19/20  Iteration 3296/3560 Training loss: 1.8271 0.3795 sec/batch
Epoch 19/20  Iteration 3297/3560 Training loss: 1.8267 0.3835 sec/batch
Epoch 19/20  Iteration 3298/3560 Training loss: 1.8264 0.3848 sec/batch
Epoch 19/20  Iteration 3299/3560 Training loss: 1.8261 0.3853 sec/batch
Epoch 19/20  Iteration 3300/3560 Training loss: 1.8262 0.3856 sec/batch
Epoch 19/20  Iteration 3301/3560 Training loss: 1.8260 0.4000 sec/batch
Epoch 19/20  Iteration 3302/3560 Training loss: 1.8256 0.4104 sec/batch
Epoch 19/20  Iteration 3303/3560 Training loss: 1.8253 0.3833 sec/batch
Epoch 19/20  Iteration 3304/3560 Training loss: 1.8249 0.3787 sec/batch
Epoch 19/20  Iteration 3305/3560 Training loss: 1.8249 0.3947 sec/batch
Epoch 19/20  Iteration 3306/3560 Training loss: 1.8247 0.3857 sec/batch
Epoch 19/20  Iteration 3307/3560 Training loss: 1.8244 0.3876 sec/batch
Epoch 19/20  Iteration 3308/3560 Training loss: 1.8243 0.3992 sec/batch
Epoch 19/20  Iteration 3309/3560 Training loss: 1.8240 0.3820 sec/batch
Epoch 19/20  Iteration 3310/3560 Training loss: 1.8238 0.3839 sec/batch
Epoch 19/20  Iteration 3311/3560 Training loss: 1.8238 0.3891 sec/batch
Epoch 19/20  Iteration 3312/3560 Training loss: 1.8238 0.3878 sec/batch
Epoch 19/20  Iteration 3313/3560 Training loss: 1.8240 0.3804 sec/batch
Epoch 19/20  Iteration 3314/3560 Training loss: 1.8240 0.3833 sec/batch
Epoch 19/20  Iteration 3315/3560 Training loss: 1.8240 0.4064 sec/batch
Epoch 19/20  Iteration 3316/3560 Training loss: 1.8240 0.3824 sec/batch
Epoch 19/20  Iteration 3317/3560 Training loss: 1.8239 0.3900 sec/batch
Epoch 19/20  Iteration 3318/3560 Training loss: 1.8238 0.3821 sec/batch
Epoch 19/20  Iteration 3319/3560 Training loss: 1.8236 0.3849 sec/batch
Epoch 19/20  Iteration 3320/3560 Training loss: 1.8233 0.3845 sec/batch
Epoch 19/20  Iteration 3321/3560 Training loss: 1.8233 0.3864 sec/batch
Epoch 19/20  Iteration 3322/3560 Training loss: 1.8232 0.3857 sec/batch
Epoch 19/20  Iteration 3323/3560 Training loss: 1.8232 0.3968 sec/batch
Epoch 19/20  Iteration 3324/3560 Training loss: 1.8231 0.3859 sec/batch
Epoch 19/20  Iteration 3325/3560 Training loss: 1.8231 0.3850 sec/batch
Epoch 19/20  Iteration 3326/3560 Training loss: 1.8228 0.3799 sec/batch
Epoch 19/20  Iteration 3327/3560 Training loss: 1.8225 0.3805 sec/batch
Epoch 19/20  Iteration 3328/3560 Training loss: 1.8227 0.4085 sec/batch
Epoch 19/20  Iteration 3329/3560 Training loss: 1.8227 0.3874 sec/batch
Epoch 19/20  Iteration 3330/3560 Training loss: 1.8224 0.3885 sec/batch
Epoch 19/20  Iteration 3331/3560 Training loss: 1.8225 0.3863 sec/batch
Epoch 19/20  Iteration 3332/3560 Training loss: 1.8225 0.3867 sec/batch
Epoch 19/20  Iteration 3333/3560 Training loss: 1.8225 0.3876 sec/batch
Epoch 19/20  Iteration 3334/3560 Training loss: 1.8224 0.3838 sec/batch
Epoch 19/20  Iteration 3335/3560 Training loss: 1.8221 0.3974 sec/batch
Epoch 19/20  Iteration 3336/3560 Training loss: 1.8219 0.4504 sec/batch
Epoch 19/20  Iteration 3337/3560 Training loss: 1.8220 0.3872 sec/batch
Epoch 19/20  Iteration 3338/3560 Training loss: 1.8219 0.3837 sec/batch
Epoch 19/20  Iteration 3339/3560 Training loss: 1.8219 0.4091 sec/batch
Epoch 19/20  Iteration 3340/3560 Training loss: 1.8220 0.4039 sec/batch
Epoch 19/20  Iteration 3341/3560 Training loss: 1.8221 0.4000 sec/batch
Epoch 19/20  Iteration 3342/3560 Training loss: 1.8222 0.3834 sec/batch
Epoch 19/20  Iteration 3343/3560 Training loss: 1.8223 0.3830 sec/batch
Epoch 19/20  Iteration 3344/3560 Training loss: 1.8221 0.3866 sec/batch
Epoch 19/20  Iteration 3345/3560 Training loss: 1.8223 0.3899 sec/batch
Epoch 19/20  Iteration 3346/3560 Training loss: 1.8223 0.3795 sec/batch
Epoch 19/20  Iteration 3347/3560 Training loss: 1.8223 0.3842 sec/batch
Epoch 19/20  Iteration 3348/3560 Training loss: 1.8223 0.4002 sec/batch
Epoch 19/20  Iteration 3349/3560 Training loss: 1.8223 0.3832 sec/batch
Epoch 19/20  Iteration 3350/3560 Training loss: 1.8224 0.3820 sec/batch
Epoch 19/20  Iteration 3351/3560 Training loss: 1.8224 0.3805 sec/batch
Epoch 19/20  Iteration 3352/3560 Training loss: 1.8225 0.3833 sec/batch
Epoch 19/20  Iteration 3353/3560 Training loss: 1.8225 0.4167 sec/batch
Epoch 19/20  Iteration 3354/3560 Training loss: 1.8224 0.3811 sec/batch
Epoch 19/20  Iteration 3355/3560 Training loss: 1.8222 0.3913 sec/batch
Epoch 19/20  Iteration 3356/3560 Training loss: 1.8223 0.3796 sec/batch
Epoch 19/20  Iteration 3357/3560 Training loss: 1.8224 0.3858 sec/batch
Epoch 19/20  Iteration 3358/3560 Training loss: 1.8224 0.3846 sec/batch
Epoch 19/20  Iteration 3359/3560 Training loss: 1.8224 0.3842 sec/batch
Epoch 19/20  Iteration 3360/3560 Training loss: 1.8224 0.3820 sec/batch
Epoch 19/20  Iteration 3361/3560 Training loss: 1.8225 0.3908 sec/batch
Epoch 19/20  Iteration 3362/3560 Training loss: 1.8225 0.3844 sec/batch
Epoch 19/20  Iteration 3363/3560 Training loss: 1.8223 0.3861 sec/batch
Epoch 19/20  Iteration 3364/3560 Training loss: 1.8224 0.3916 sec/batch
Epoch 19/20  Iteration 3365/3560 Training loss: 1.8226 0.3874 sec/batch
Epoch 19/20  Iteration 3366/3560 Training loss: 1.8226 0.4068 sec/batch
Epoch 19/20  Iteration 3367/3560 Training loss: 1.8227 0.3834 sec/batch
Epoch 19/20  Iteration 3368/3560 Training loss: 1.8226 0.3847 sec/batch
Epoch 19/20  Iteration 3369/3560 Training loss: 1.8226 0.3777 sec/batch
Epoch 19/20  Iteration 3370/3560 Training loss: 1.8225 0.3929 sec/batch
Epoch 19/20  Iteration 3371/3560 Training loss: 1.8227 0.3868 sec/batch
Epoch 19/20  Iteration 3372/3560 Training loss: 1.8230 0.3809 sec/batch
Epoch 19/20  Iteration 3373/3560 Training loss: 1.8230 0.3856 sec/batch
Epoch 19/20  Iteration 3374/3560 Training loss: 1.8230 0.3851 sec/batch
Epoch 19/20  Iteration 3375/3560 Training loss: 1.8229 0.3818 sec/batch
Epoch 19/20  Iteration 3376/3560 Training loss: 1.8228 0.3976 sec/batch
Epoch 19/20  Iteration 3377/3560 Training loss: 1.8230 0.4023 sec/batch
Epoch 19/20  Iteration 3378/3560 Training loss: 1.8230 0.3862 sec/batch
Epoch 19/20  Iteration 3379/3560 Training loss: 1.8231 0.4141 sec/batch
Epoch 19/20  Iteration 3380/3560 Training loss: 1.8230 0.3844 sec/batch
Epoch 19/20  Iteration 3381/3560 Training loss: 1.8229 0.3871 sec/batch
Epoch 19/20  Iteration 3382/3560 Training loss: 1.8230 0.3798 sec/batch
Epoch 20/20  Iteration 3383/3560 Training loss: 1.8798 0.3841 sec/batch
Epoch 20/20  Iteration 3384/3560 Training loss: 1.8406 0.3813 sec/batch
Epoch 20/20  Iteration 3385/3560 Training loss: 1.8332 0.3805 sec/batch
Epoch 20/20  Iteration 3386/3560 Training loss: 1.8293 0.3866 sec/batch
Epoch 20/20  Iteration 3387/3560 Training loss: 1.8257 0.3851 sec/batch
Epoch 20/20  Iteration 3388/3560 Training loss: 1.8181 0.3902 sec/batch
Epoch 20/20  Iteration 3389/3560 Training loss: 1.8190 0.3887 sec/batch
Epoch 20/20  Iteration 3390/3560 Training loss: 1.8181 0.3837 sec/batch
Epoch 20/20  Iteration 3391/3560 Training loss: 1.8208 0.3938 sec/batch
Epoch 20/20  Iteration 3392/3560 Training loss: 1.8202 0.4055 sec/batch
Epoch 20/20  Iteration 3393/3560 Training loss: 1.8166 0.3825 sec/batch
Epoch 20/20  Iteration 3394/3560 Training loss: 1.8146 0.3864 sec/batch
Epoch 20/20  Iteration 3395/3560 Training loss: 1.8145 0.3793 sec/batch
Epoch 20/20  Iteration 3396/3560 Training loss: 1.8173 0.3826 sec/batch
Epoch 20/20  Iteration 3397/3560 Training loss: 1.8168 0.3822 sec/batch
Epoch 20/20  Iteration 3398/3560 Training loss: 1.8153 0.3840 sec/batch
Epoch 20/20  Iteration 3399/3560 Training loss: 1.8149 0.3895 sec/batch
Epoch 20/20  Iteration 3400/3560 Training loss: 1.8173 0.3778 sec/batch
Epoch 20/20  Iteration 3401/3560 Training loss: 1.8174 0.3850 sec/batch
Epoch 20/20  Iteration 3402/3560 Training loss: 1.8182 0.3839 sec/batch
Epoch 20/20  Iteration 3403/3560 Training loss: 1.8171 0.3797 sec/batch
Epoch 20/20  Iteration 3404/3560 Training loss: 1.8178 0.3890 sec/batch
Epoch 20/20  Iteration 3405/3560 Training loss: 1.8174 0.4000 sec/batch
Epoch 20/20  Iteration 3406/3560 Training loss: 1.8172 0.3941 sec/batch
Epoch 20/20  Iteration 3407/3560 Training loss: 1.8167 0.3950 sec/batch
Epoch 20/20  Iteration 3408/3560 Training loss: 1.8156 0.3843 sec/batch
Epoch 20/20  Iteration 3409/3560 Training loss: 1.8148 0.3823 sec/batch
Epoch 20/20  Iteration 3410/3560 Training loss: 1.8156 0.3849 sec/batch
Epoch 20/20  Iteration 3411/3560 Training loss: 1.8166 0.3784 sec/batch
Epoch 20/20  Iteration 3412/3560 Training loss: 1.8168 0.3819 sec/batch
Epoch 20/20  Iteration 3413/3560 Training loss: 1.8168 0.3791 sec/batch
Epoch 20/20  Iteration 3414/3560 Training loss: 1.8163 0.3875 sec/batch
Epoch 20/20  Iteration 3415/3560 Training loss: 1.8169 0.3921 sec/batch
Epoch 20/20  Iteration 3416/3560 Training loss: 1.8176 0.3887 sec/batch
Epoch 20/20  Iteration 3417/3560 Training loss: 1.8174 0.3844 sec/batch
Epoch 20/20  Iteration 3418/3560 Training loss: 1.8171 0.4241 sec/batch
Epoch 20/20  Iteration 3419/3560 Training loss: 1.8169 0.3930 sec/batch
Epoch 20/20  Iteration 3420/3560 Training loss: 1.8159 0.3779 sec/batch
Epoch 20/20  Iteration 3421/3560 Training loss: 1.8147 0.3957 sec/batch
Epoch 20/20  Iteration 3422/3560 Training loss: 1.8141 0.3797 sec/batch
...
Epoch 20/20  Iteration 3560/3560 Training loss: 1.8077 0.4178 sec/batch
Epoch 1/20  Iteration 1/3560 Training loss: 4.4211 0.6613 sec/batch
...
Epoch 1/20  Iteration 178/3560 Training loss: 3.0747 0.6478 sec/batch
Epoch 2/20  Iteration 179/3560 Training loss: 2.6706 0.6267 sec/batch
...
Epoch 2/20  Iteration 356/3560 Training loss: 2.4441 0.6412 sec/batch
Epoch 3/20  Iteration 357/3560 Training loss: 2.3836 0.6350 sec/batch
...
Epoch 3/20  Iteration 534/3560 Training loss: 2.2584 0.6235 sec/batch
Epoch 4/20  Iteration 535/3560 Training loss: 2.2558 0.6430 sec/batch
...
Epoch 4/20  Iteration 712/3560 Training loss: 2.1486 0.6553 sec/batch
Epoch 5/20  Iteration 713/3560 Training loss: 2.1626 0.6328 sec/batch
...
Epoch 5/20  Iteration 775/3560 Training loss: 2.0903 0.6439 sec/batch
Epoch 5/20  Iteration 776/3560 Training loss: 2.0902 0.6370 sec/batch
Epoch 5/20  Iteration 777/3560 Training loss: 2.0898 0.6383 sec/batch
Epoch 5/20  Iteration 778/3560 Training loss: 2.0898 0.6359 sec/batch
Epoch 5/20  Iteration 779/3560 Training loss: 2.0898 0.6279 sec/batch
Epoch 5/20  Iteration 780/3560 Training loss: 2.0890 0.6348 sec/batch
Epoch 5/20  Iteration 781/3560 Training loss: 2.0887 0.6316 sec/batch
Epoch 5/20  Iteration 782/3560 Training loss: 2.0884 0.6380 sec/batch
Epoch 5/20  Iteration 783/3560 Training loss: 2.0885 0.6536 sec/batch
Epoch 5/20  Iteration 784/3560 Training loss: 2.0884 0.6451 sec/batch
Epoch 5/20  Iteration 785/3560 Training loss: 2.0885 0.6354 sec/batch
Epoch 5/20  Iteration 786/3560 Training loss: 2.0881 0.6295 sec/batch
Epoch 5/20  Iteration 787/3560 Training loss: 2.0880 0.6349 sec/batch
Epoch 5/20  Iteration 788/3560 Training loss: 2.0882 0.6286 sec/batch
Epoch 5/20  Iteration 789/3560 Training loss: 2.0879 0.6380 sec/batch
Epoch 5/20  Iteration 790/3560 Training loss: 2.0878 0.6456 sec/batch
Epoch 5/20  Iteration 791/3560 Training loss: 2.0874 0.6629 sec/batch
Epoch 5/20  Iteration 792/3560 Training loss: 2.0871 0.6436 sec/batch
Epoch 5/20  Iteration 793/3560 Training loss: 2.0866 0.6366 sec/batch
Epoch 5/20  Iteration 794/3560 Training loss: 2.0865 0.6585 sec/batch
Epoch 5/20  Iteration 795/3560 Training loss: 2.0859 0.8354 sec/batch
Epoch 5/20  Iteration 796/3560 Training loss: 2.0857 0.8210 sec/batch
Epoch 5/20  Iteration 797/3560 Training loss: 2.0851 0.6317 sec/batch
Epoch 5/20  Iteration 798/3560 Training loss: 2.0846 0.6509 sec/batch
Epoch 5/20  Iteration 799/3560 Training loss: 2.0844 0.6382 sec/batch
Epoch 5/20  Iteration 800/3560 Training loss: 2.0840 0.6469 sec/batch
Epoch 5/20  Iteration 801/3560 Training loss: 2.0835 0.6330 sec/batch
Epoch 5/20  Iteration 802/3560 Training loss: 2.0835 0.6558 sec/batch
Epoch 5/20  Iteration 803/3560 Training loss: 2.0831 0.6376 sec/batch
Epoch 5/20  Iteration 804/3560 Training loss: 2.0830 0.6275 sec/batch
Epoch 5/20  Iteration 805/3560 Training loss: 2.0825 0.6497 sec/batch
Epoch 5/20  Iteration 806/3560 Training loss: 2.0821 0.6476 sec/batch
Epoch 5/20  Iteration 807/3560 Training loss: 2.0817 0.6316 sec/batch
Epoch 5/20  Iteration 808/3560 Training loss: 2.0814 0.6327 sec/batch
Epoch 5/20  Iteration 809/3560 Training loss: 2.0812 0.6365 sec/batch
Epoch 5/20  Iteration 810/3560 Training loss: 2.0807 0.6434 sec/batch
Epoch 5/20  Iteration 811/3560 Training loss: 2.0803 0.6294 sec/batch
Epoch 5/20  Iteration 812/3560 Training loss: 2.0798 0.6407 sec/batch
Epoch 5/20  Iteration 813/3560 Training loss: 2.0797 0.6580 sec/batch
Epoch 5/20  Iteration 814/3560 Training loss: 2.0796 0.6525 sec/batch
Epoch 5/20  Iteration 815/3560 Training loss: 2.0792 0.6323 sec/batch
Epoch 5/20  Iteration 816/3560 Training loss: 2.0788 0.6283 sec/batch
Epoch 5/20  Iteration 817/3560 Training loss: 2.0785 0.6295 sec/batch
Epoch 5/20  Iteration 818/3560 Training loss: 2.0783 0.6393 sec/batch
Epoch 5/20  Iteration 819/3560 Training loss: 2.0780 0.6270 sec/batch
Epoch 5/20  Iteration 820/3560 Training loss: 2.0778 0.6513 sec/batch
Epoch 5/20  Iteration 821/3560 Training loss: 2.0778 0.6232 sec/batch
Epoch 5/20  Iteration 822/3560 Training loss: 2.0774 0.6325 sec/batch
Epoch 5/20  Iteration 823/3560 Training loss: 2.0772 0.6354 sec/batch
Epoch 5/20  Iteration 824/3560 Training loss: 2.0769 0.6372 sec/batch
Epoch 5/20  Iteration 825/3560 Training loss: 2.0766 0.6812 sec/batch
Epoch 5/20  Iteration 826/3560 Training loss: 2.0763 0.6364 sec/batch
Epoch 5/20  Iteration 827/3560 Training loss: 2.0760 0.6323 sec/batch
Epoch 5/20  Iteration 828/3560 Training loss: 2.0755 0.6382 sec/batch
Epoch 5/20  Iteration 829/3560 Training loss: 2.0752 0.6414 sec/batch
Epoch 5/20  Iteration 830/3560 Training loss: 2.0750 0.6498 sec/batch
Epoch 5/20  Iteration 831/3560 Training loss: 2.0750 0.6287 sec/batch
Epoch 5/20  Iteration 832/3560 Training loss: 2.0747 0.6366 sec/batch
Epoch 5/20  Iteration 833/3560 Training loss: 2.0746 0.6347 sec/batch
Epoch 5/20  Iteration 834/3560 Training loss: 2.0742 0.6604 sec/batch
Epoch 5/20  Iteration 835/3560 Training loss: 2.0738 0.6381 sec/batch
Epoch 5/20  Iteration 836/3560 Training loss: 2.0738 0.6320 sec/batch
Epoch 5/20  Iteration 837/3560 Training loss: 2.0736 0.6408 sec/batch
Epoch 5/20  Iteration 838/3560 Training loss: 2.0732 0.6722 sec/batch
Epoch 5/20  Iteration 839/3560 Training loss: 2.0731 0.6301 sec/batch
Epoch 5/20  Iteration 840/3560 Training loss: 2.0730 0.6316 sec/batch
Epoch 5/20  Iteration 841/3560 Training loss: 2.0728 0.6326 sec/batch
Epoch 5/20  Iteration 842/3560 Training loss: 2.0726 0.6251 sec/batch
Epoch 5/20  Iteration 843/3560 Training loss: 2.0724 0.6362 sec/batch
Epoch 5/20  Iteration 844/3560 Training loss: 2.0721 0.6378 sec/batch
Epoch 5/20  Iteration 845/3560 Training loss: 2.0720 0.6288 sec/batch
Epoch 5/20  Iteration 846/3560 Training loss: 2.0719 0.6614 sec/batch
Epoch 5/20  Iteration 847/3560 Training loss: 2.0716 0.6211 sec/batch
Epoch 5/20  Iteration 848/3560 Training loss: 2.0716 0.6312 sec/batch
Epoch 5/20  Iteration 849/3560 Training loss: 2.0715 0.6526 sec/batch
Epoch 5/20  Iteration 850/3560 Training loss: 2.0714 0.6326 sec/batch
Epoch 5/20  Iteration 851/3560 Training loss: 2.0714 0.6214 sec/batch
Epoch 5/20  Iteration 852/3560 Training loss: 2.0710 0.6372 sec/batch
Epoch 5/20  Iteration 853/3560 Training loss: 2.0710 0.6346 sec/batch
Epoch 5/20  Iteration 854/3560 Training loss: 2.0707 0.6593 sec/batch
Epoch 5/20  Iteration 855/3560 Training loss: 2.0706 0.6331 sec/batch
Epoch 5/20  Iteration 856/3560 Training loss: 2.0704 0.6452 sec/batch
Epoch 5/20  Iteration 857/3560 Training loss: 2.0702 0.6557 sec/batch
Epoch 5/20  Iteration 858/3560 Training loss: 2.0702 0.6388 sec/batch
Epoch 5/20  Iteration 859/3560 Training loss: 2.0701 0.6362 sec/batch
Epoch 5/20  Iteration 860/3560 Training loss: 2.0700 0.6336 sec/batch
Epoch 5/20  Iteration 861/3560 Training loss: 2.0699 0.6650 sec/batch
Epoch 5/20  Iteration 862/3560 Training loss: 2.0696 0.6377 sec/batch
Epoch 5/20  Iteration 863/3560 Training loss: 2.0694 0.6392 sec/batch
Epoch 5/20  Iteration 864/3560 Training loss: 2.0695 0.6368 sec/batch
Epoch 5/20  Iteration 865/3560 Training loss: 2.0693 0.6362 sec/batch
Epoch 5/20  Iteration 866/3560 Training loss: 2.0692 0.6354 sec/batch
Epoch 5/20  Iteration 867/3560 Training loss: 2.0690 0.6307 sec/batch
Epoch 5/20  Iteration 868/3560 Training loss: 2.0688 0.6321 sec/batch
Epoch 5/20  Iteration 869/3560 Training loss: 2.0687 0.6541 sec/batch
Epoch 5/20  Iteration 870/3560 Training loss: 2.0685 0.7047 sec/batch
Epoch 5/20  Iteration 871/3560 Training loss: 2.0682 0.6354 sec/batch
Epoch 5/20  Iteration 872/3560 Training loss: 2.0681 0.6650 sec/batch
Epoch 5/20  Iteration 873/3560 Training loss: 2.0681 0.6600 sec/batch
Epoch 5/20  Iteration 874/3560 Training loss: 2.0679 0.6351 sec/batch
Epoch 5/20  Iteration 875/3560 Training loss: 2.0677 0.6294 sec/batch
Epoch 5/20  Iteration 876/3560 Training loss: 2.0676 0.6338 sec/batch
Epoch 5/20  Iteration 877/3560 Training loss: 2.0674 0.6497 sec/batch
Epoch 5/20  Iteration 878/3560 Training loss: 2.0672 0.6364 sec/batch
Epoch 5/20  Iteration 879/3560 Training loss: 2.0670 0.6302 sec/batch
Epoch 5/20  Iteration 880/3560 Training loss: 2.0670 0.6342 sec/batch
Epoch 5/20  Iteration 881/3560 Training loss: 2.0668 0.6286 sec/batch
Epoch 5/20  Iteration 882/3560 Training loss: 2.0666 0.6375 sec/batch
Epoch 5/20  Iteration 883/3560 Training loss: 2.0663 0.6403 sec/batch
Epoch 5/20  Iteration 884/3560 Training loss: 2.0661 0.6382 sec/batch
Epoch 5/20  Iteration 885/3560 Training loss: 2.0660 0.6407 sec/batch
Epoch 5/20  Iteration 886/3560 Training loss: 2.0659 0.6342 sec/batch
Epoch 5/20  Iteration 887/3560 Training loss: 2.0658 0.6399 sec/batch
Epoch 5/20  Iteration 888/3560 Training loss: 2.0657 0.6336 sec/batch
Epoch 5/20  Iteration 889/3560 Training loss: 2.0654 0.6345 sec/batch
Epoch 5/20  Iteration 890/3560 Training loss: 2.0652 0.6339 sec/batch
Epoch 6/20  Iteration 891/3560 Training loss: 2.0964 0.6281 sec/batch
Epoch 6/20  Iteration 892/3560 Training loss: 2.0570 0.6523 sec/batch
Epoch 6/20  Iteration 893/3560 Training loss: 2.0418 0.6571 sec/batch
Epoch 6/20  Iteration 894/3560 Training loss: 2.0365 0.6357 sec/batch
Epoch 6/20  Iteration 895/3560 Training loss: 2.0351 0.6497 sec/batch
Epoch 6/20  Iteration 896/3560 Training loss: 2.0290 0.6391 sec/batch
Epoch 6/20  Iteration 897/3560 Training loss: 2.0290 0.6411 sec/batch
Epoch 6/20  Iteration 898/3560 Training loss: 2.0298 0.6389 sec/batch
Epoch 6/20  Iteration 899/3560 Training loss: 2.0320 0.6328 sec/batch
Epoch 6/20  Iteration 900/3560 Training loss: 2.0318 0.6292 sec/batch
Epoch 6/20  Iteration 901/3560 Training loss: 2.0296 0.6586 sec/batch
Epoch 6/20  Iteration 902/3560 Training loss: 2.0283 0.6567 sec/batch
Epoch 6/20  Iteration 903/3560 Training loss: 2.0283 0.8374 sec/batch
Epoch 6/20  Iteration 904/3560 Training loss: 2.0303 0.6440 sec/batch
Epoch 6/20  Iteration 905/3560 Training loss: 2.0305 0.6420 sec/batch
Epoch 6/20  Iteration 906/3560 Training loss: 2.0290 0.6411 sec/batch
Epoch 6/20  Iteration 907/3560 Training loss: 2.0290 0.6350 sec/batch
Epoch 6/20  Iteration 908/3560 Training loss: 2.0310 0.6419 sec/batch
Epoch 6/20  Iteration 909/3560 Training loss: 2.0304 0.6367 sec/batch
Epoch 6/20  Iteration 910/3560 Training loss: 2.0305 0.6451 sec/batch
Epoch 6/20  Iteration 911/3560 Training loss: 2.0296 0.6390 sec/batch
Epoch 6/20  Iteration 912/3560 Training loss: 2.0307 0.6292 sec/batch
Epoch 6/20  Iteration 913/3560 Training loss: 2.0299 0.6380 sec/batch
Epoch 6/20  Iteration 914/3560 Training loss: 2.0294 0.6418 sec/batch
Epoch 6/20  Iteration 915/3560 Training loss: 2.0290 0.6574 sec/batch
Epoch 6/20  Iteration 916/3560 Training loss: 2.0281 0.6450 sec/batch
Epoch 6/20  Iteration 917/3560 Training loss: 2.0272 0.6316 sec/batch
Epoch 6/20  Iteration 918/3560 Training loss: 2.0271 0.6438 sec/batch
Epoch 6/20  Iteration 919/3560 Training loss: 2.0281 0.6304 sec/batch
Epoch 6/20  Iteration 920/3560 Training loss: 2.0284 0.6395 sec/batch
Epoch 6/20  Iteration 921/3560 Training loss: 2.0284 0.6346 sec/batch
Epoch 6/20  Iteration 922/3560 Training loss: 2.0281 0.6302 sec/batch
Epoch 6/20  Iteration 923/3560 Training loss: 2.0278 0.6138 sec/batch
Epoch 6/20  Iteration 924/3560 Training loss: 2.0284 0.6555 sec/batch
Epoch 6/20  Iteration 925/3560 Training loss: 2.0279 0.6367 sec/batch
Epoch 6/20  Iteration 926/3560 Training loss: 2.0275 0.6422 sec/batch
Epoch 6/20  Iteration 927/3560 Training loss: 2.0271 0.6378 sec/batch
Epoch 6/20  Iteration 928/3560 Training loss: 2.0261 0.6385 sec/batch
Epoch 6/20  Iteration 929/3560 Training loss: 2.0253 0.6346 sec/batch
Epoch 6/20  Iteration 930/3560 Training loss: 2.0246 0.6326 sec/batch
Epoch 6/20  Iteration 931/3560 Training loss: 2.0241 0.6348 sec/batch
Epoch 6/20  Iteration 932/3560 Training loss: 2.0240 0.6442 sec/batch
Epoch 6/20  Iteration 933/3560 Training loss: 2.0234 0.6351 sec/batch
Epoch 6/20  Iteration 934/3560 Training loss: 2.0226 0.6352 sec/batch
Epoch 6/20  Iteration 935/3560 Training loss: 2.0226 0.6470 sec/batch
Epoch 6/20  Iteration 936/3560 Training loss: 2.0214 0.6419 sec/batch
Epoch 6/20  Iteration 937/3560 Training loss: 2.0216 0.6371 sec/batch
Epoch 6/20  Iteration 938/3560 Training loss: 2.0211 0.6358 sec/batch
Epoch 6/20  Iteration 939/3560 Training loss: 2.0207 0.6431 sec/batch
Epoch 6/20  Iteration 940/3560 Training loss: 2.0213 0.6591 sec/batch
Epoch 6/20  Iteration 941/3560 Training loss: 2.0208 0.6556 sec/batch
Epoch 6/20  Iteration 942/3560 Training loss: 2.0215 0.6438 sec/batch
Epoch 6/20  Iteration 943/3560 Training loss: 2.0212 0.6258 sec/batch
Epoch 6/20  Iteration 944/3560 Training loss: 2.0208 0.6346 sec/batch
Epoch 6/20  Iteration 945/3560 Training loss: 2.0203 0.6297 sec/batch
Epoch 6/20  Iteration 946/3560 Training loss: 2.0204 0.6361 sec/batch
Epoch 6/20  Iteration 947/3560 Training loss: 2.0204 0.6334 sec/batch
Epoch 6/20  Iteration 948/3560 Training loss: 2.0201 0.6557 sec/batch
Epoch 6/20  Iteration 949/3560 Training loss: 2.0196 0.6316 sec/batch
Epoch 6/20  Iteration 950/3560 Training loss: 2.0196 0.6310 sec/batch
Epoch 6/20  Iteration 951/3560 Training loss: 2.0194 0.6442 sec/batch
Epoch 6/20  Iteration 952/3560 Training loss: 2.0200 0.6280 sec/batch
Epoch 6/20  Iteration 953/3560 Training loss: 2.0204 0.6367 sec/batch
Epoch 6/20  Iteration 954/3560 Training loss: 2.0204 0.6500 sec/batch
Epoch 6/20  Iteration 955/3560 Training loss: 2.0201 0.6360 sec/batch
Epoch 6/20  Iteration 956/3560 Training loss: 2.0203 0.6595 sec/batch
Epoch 6/20  Iteration 957/3560 Training loss: 2.0202 0.6374 sec/batch
Epoch 6/20  Iteration 958/3560 Training loss: 2.0197 0.6503 sec/batch
Epoch 6/20  Iteration 959/3560 Training loss: 2.0193 0.6333 sec/batch
Epoch 6/20  Iteration 960/3560 Training loss: 2.0191 0.6330 sec/batch
Epoch 6/20  Iteration 961/3560 Training loss: 2.0192 0.6451 sec/batch
Epoch 6/20  Iteration 962/3560 Training loss: 2.0191 0.6376 sec/batch
Epoch 6/20  Iteration 963/3560 Training loss: 2.0193 0.6305 sec/batch
Epoch 6/20  Iteration 964/3560 Training loss: 2.0188 0.6368 sec/batch
Epoch 6/20  Iteration 965/3560 Training loss: 2.0186 0.6280 sec/batch
Epoch 6/20  Iteration 966/3560 Training loss: 2.0187 0.6385 sec/batch
Epoch 6/20  Iteration 967/3560 Training loss: 2.0185 0.6385 sec/batch
Epoch 6/20  Iteration 968/3560 Training loss: 2.0184 0.6477 sec/batch
Epoch 6/20  Iteration 969/3560 Training loss: 2.0179 0.6411 sec/batch
Epoch 6/20  Iteration 970/3560 Training loss: 2.0177 0.6454 sec/batch
Epoch 6/20  Iteration 971/3560 Training loss: 2.0170 0.6496 sec/batch
Epoch 6/20  Iteration 972/3560 Training loss: 2.0169 0.6373 sec/batch
Epoch 6/20  Iteration 973/3560 Training loss: 2.0164 0.6360 sec/batch
Epoch 6/20  Iteration 974/3560 Training loss: 2.0161 0.6342 sec/batch
Epoch 6/20  Iteration 975/3560 Training loss: 2.0154 0.6352 sec/batch
Epoch 6/20  Iteration 976/3560 Training loss: 2.0149 0.6361 sec/batch
Epoch 6/20  Iteration 977/3560 Training loss: 2.0147 0.6313 sec/batch
Epoch 6/20  Iteration 978/3560 Training loss: 2.0142 0.6353 sec/batch
Epoch 6/20  Iteration 979/3560 Training loss: 2.0137 0.6439 sec/batch
Epoch 6/20  Iteration 980/3560 Training loss: 2.0137 0.6373 sec/batch
Epoch 6/20  Iteration 981/3560 Training loss: 2.0134 0.6395 sec/batch
Epoch 6/20  Iteration 982/3560 Training loss: 2.0132 0.6354 sec/batch
Epoch 6/20  Iteration 983/3560 Training loss: 2.0127 0.6390 sec/batch
Epoch 6/20  Iteration 984/3560 Training loss: 2.0122 0.6328 sec/batch
Epoch 6/20  Iteration 985/3560 Training loss: 2.0118 0.6237 sec/batch
Epoch 6/20  Iteration 986/3560 Training loss: 2.0115 0.6395 sec/batch
Epoch 6/20  Iteration 987/3560 Training loss: 2.0114 0.6592 sec/batch
Epoch 6/20  Iteration 988/3560 Training loss: 2.0109 0.6636 sec/batch
Epoch 6/20  Iteration 989/3560 Training loss: 2.0105 0.6408 sec/batch
Epoch 6/20  Iteration 990/3560 Training loss: 2.0099 0.6300 sec/batch
Epoch 6/20  Iteration 991/3560 Training loss: 2.0100 0.6295 sec/batch
Epoch 6/20  Iteration 992/3560 Training loss: 2.0098 0.6298 sec/batch
Epoch 6/20  Iteration 993/3560 Training loss: 2.0095 0.6412 sec/batch
Epoch 6/20  Iteration 994/3560 Training loss: 2.0090 0.6266 sec/batch
Epoch 6/20  Iteration 995/3560 Training loss: 2.0086 0.6509 sec/batch
Epoch 6/20  Iteration 996/3560 Training loss: 2.0085 0.6262 sec/batch
Epoch 6/20  Iteration 997/3560 Training loss: 2.0082 0.6452 sec/batch
Epoch 6/20  Iteration 998/3560 Training loss: 2.0081 0.6349 sec/batch
Epoch 6/20  Iteration 999/3560 Training loss: 2.0081 0.6398 sec/batch
Epoch 6/20  Iteration 1000/3560 Training loss: 2.0078 0.6374 sec/batch
Epoch 6/20  Iteration 1001/3560 Training loss: 2.0077 0.6311 sec/batch
Epoch 6/20  Iteration 1002/3560 Training loss: 2.0075 0.5978 sec/batch
Epoch 6/20  Iteration 1003/3560 Training loss: 2.0072 0.6517 sec/batch
Epoch 6/20  Iteration 1004/3560 Training loss: 2.0070 0.6427 sec/batch
Epoch 6/20  Iteration 1005/3560 Training loss: 2.0067 0.6305 sec/batch
Epoch 6/20  Iteration 1006/3560 Training loss: 2.0062 0.6399 sec/batch
Epoch 6/20  Iteration 1007/3560 Training loss: 2.0060 0.6235 sec/batch
Epoch 6/20  Iteration 1008/3560 Training loss: 2.0058 0.6401 sec/batch
Epoch 6/20  Iteration 1009/3560 Training loss: 2.0057 0.6379 sec/batch
Epoch 6/20  Iteration 1010/3560 Training loss: 2.0055 0.6303 sec/batch
Epoch 6/20  Iteration 1011/3560 Training loss: 2.0054 0.6531 sec/batch
Epoch 6/20  Iteration 1012/3560 Training loss: 2.0051 0.6357 sec/batch
Epoch 6/20  Iteration 1013/3560 Training loss: 2.0049 0.6330 sec/batch
Epoch 6/20  Iteration 1014/3560 Training loss: 2.0049 0.6225 sec/batch
Epoch 6/20  Iteration 1015/3560 Training loss: 2.0048 0.6349 sec/batch
Epoch 6/20  Iteration 1016/3560 Training loss: 2.0044 0.6391 sec/batch
Epoch 6/20  Iteration 1017/3560 Training loss: 2.0044 0.6346 sec/batch
Epoch 6/20  Iteration 1018/3560 Training loss: 2.0043 0.6316 sec/batch
Epoch 6/20  Iteration 1019/3560 Training loss: 2.0042 0.6511 sec/batch
Epoch 6/20  Iteration 1020/3560 Training loss: 2.0041 0.6426 sec/batch
Epoch 6/20  Iteration 1021/3560 Training loss: 2.0038 0.6353 sec/batch
Epoch 6/20  Iteration 1022/3560 Training loss: 2.0035 0.6364 sec/batch
Epoch 6/20  Iteration 1023/3560 Training loss: 2.0035 0.6306 sec/batch
Epoch 6/20  Iteration 1024/3560 Training loss: 2.0034 0.6273 sec/batch
Epoch 6/20  Iteration 1025/3560 Training loss: 2.0032 0.6689 sec/batch
Epoch 6/20  Iteration 1026/3560 Training loss: 2.0032 0.6346 sec/batch
Epoch 6/20  Iteration 1027/3560 Training loss: 2.0031 0.6521 sec/batch
Epoch 6/20  Iteration 1028/3560 Training loss: 2.0030 0.6308 sec/batch
Epoch 6/20  Iteration 1029/3560 Training loss: 2.0031 0.6385 sec/batch
Epoch 6/20  Iteration 1030/3560 Training loss: 2.0029 0.6265 sec/batch
Epoch 6/20  Iteration 1031/3560 Training loss: 2.0030 0.6350 sec/batch
Epoch 6/20  Iteration 1032/3560 Training loss: 2.0028 0.6353 sec/batch
Epoch 6/20  Iteration 1033/3560 Training loss: 2.0027 0.6332 sec/batch
Epoch 6/20  Iteration 1034/3560 Training loss: 2.0026 0.6456 sec/batch
Epoch 6/20  Iteration 1035/3560 Training loss: 2.0024 0.6664 sec/batch
Epoch 6/20  Iteration 1036/3560 Training loss: 2.0024 0.6253 sec/batch
Epoch 6/20  Iteration 1037/3560 Training loss: 2.0023 0.6390 sec/batch
Epoch 6/20  Iteration 1038/3560 Training loss: 2.0024 0.6115 sec/batch
Epoch 6/20  Iteration 1039/3560 Training loss: 2.0023 0.6257 sec/batch
Epoch 6/20  Iteration 1040/3560 Training loss: 2.0020 0.6352 sec/batch
Epoch 6/20  Iteration 1041/3560 Training loss: 2.0017 0.6342 sec/batch
Epoch 6/20  Iteration 1042/3560 Training loss: 2.0018 0.6407 sec/batch
Epoch 6/20  Iteration 1043/3560 Training loss: 2.0018 0.6404 sec/batch
Epoch 6/20  Iteration 1044/3560 Training loss: 2.0017 0.6513 sec/batch
Epoch 6/20  Iteration 1045/3560 Training loss: 2.0015 0.6315 sec/batch
Epoch 6/20  Iteration 1046/3560 Training loss: 2.0014 0.6333 sec/batch
Epoch 6/20  Iteration 1047/3560 Training loss: 2.0013 0.6234 sec/batch
Epoch 6/20  Iteration 1048/3560 Training loss: 2.0012 0.6381 sec/batch
Epoch 6/20  Iteration 1049/3560 Training loss: 2.0010 0.6410 sec/batch
Epoch 6/20  Iteration 1050/3560 Training loss: 2.0010 0.6548 sec/batch
Epoch 6/20  Iteration 1051/3560 Training loss: 2.0010 0.6408 sec/batch
Epoch 6/20  Iteration 1052/3560 Training loss: 2.0008 0.6392 sec/batch
Epoch 6/20  Iteration 1053/3560 Training loss: 2.0008 0.6436 sec/batch
Epoch 6/20  Iteration 1054/3560 Training loss: 2.0007 0.6351 sec/batch
Epoch 6/20  Iteration 1055/3560 Training loss: 2.0005 0.6343 sec/batch
Epoch 6/20  Iteration 1056/3560 Training loss: 2.0004 0.7058 sec/batch
Epoch 6/20  Iteration 1057/3560 Training loss: 2.0002 0.6327 sec/batch
Epoch 6/20  Iteration 1058/3560 Training loss: 2.0004 0.6920 sec/batch
Epoch 6/20  Iteration 1059/3560 Training loss: 2.0003 0.6558 sec/batch
Epoch 6/20  Iteration 1060/3560 Training loss: 2.0001 0.6367 sec/batch
Epoch 6/20  Iteration 1061/3560 Training loss: 1.9999 0.6327 sec/batch
Epoch 6/20  Iteration 1062/3560 Training loss: 1.9997 0.6468 sec/batch
Epoch 6/20  Iteration 1063/3560 Training loss: 1.9997 0.6341 sec/batch
Epoch 6/20  Iteration 1064/3560 Training loss: 1.9996 0.6358 sec/batch
Epoch 6/20  Iteration 1065/3560 Training loss: 1.9996 0.6376 sec/batch
Epoch 6/20  Iteration 1066/3560 Training loss: 1.9996 0.6579 sec/batch
Epoch 6/20  Iteration 1067/3560 Training loss: 1.9993 0.6315 sec/batch
Epoch 6/20  Iteration 1068/3560 Training loss: 1.9992 0.6448 sec/batch
Epoch 7/20  Iteration 1069/3560 Training loss: 2.0384 0.6139 sec/batch
Epoch 7/20  Iteration 1070/3560 Training loss: 1.9984 0.6389 sec/batch
Epoch 7/20  Iteration 1071/3560 Training loss: 1.9873 0.6309 sec/batch
Epoch 7/20  Iteration 1072/3560 Training loss: 1.9809 0.6516 sec/batch
Epoch 7/20  Iteration 1073/3560 Training loss: 1.9806 0.6330 sec/batch
Epoch 7/20  Iteration 1074/3560 Training loss: 1.9745 0.6547 sec/batch
Epoch 7/20  Iteration 1075/3560 Training loss: 1.9751 0.6375 sec/batch
Epoch 7/20  Iteration 1076/3560 Training loss: 1.9739 0.6392 sec/batch
Epoch 7/20  Iteration 1077/3560 Training loss: 1.9768 0.6427 sec/batch
Epoch 7/20  Iteration 1078/3560 Training loss: 1.9769 0.6347 sec/batch
Epoch 7/20  Iteration 1079/3560 Training loss: 1.9742 0.6279 sec/batch
Epoch 7/20  Iteration 1080/3560 Training loss: 1.9723 0.6337 sec/batch
Epoch 7/20  Iteration 1081/3560 Training loss: 1.9724 0.6542 sec/batch
Epoch 7/20  Iteration 1082/3560 Training loss: 1.9754 0.6459 sec/batch
Epoch 7/20  Iteration 1083/3560 Training loss: 1.9749 0.6384 sec/batch
Epoch 7/20  Iteration 1084/3560 Training loss: 1.9732 0.6333 sec/batch
Epoch 7/20  Iteration 1085/3560 Training loss: 1.9725 0.6307 sec/batch
Epoch 7/20  Iteration 1086/3560 Training loss: 1.9747 0.6273 sec/batch
Epoch 7/20  Iteration 1087/3560 Training loss: 1.9740 0.6459 sec/batch
Epoch 7/20  Iteration 1088/3560 Training loss: 1.9737 0.6281 sec/batch
Epoch 7/20  Iteration 1089/3560 Training loss: 1.9727 0.6362 sec/batch
Epoch 7/20  Iteration 1090/3560 Training loss: 1.9741 0.6564 sec/batch
Epoch 7/20  Iteration 1091/3560 Training loss: 1.9731 0.6373 sec/batch
Epoch 7/20  Iteration 1092/3560 Training loss: 1.9724 0.6344 sec/batch
Epoch 7/20  Iteration 1093/3560 Training loss: 1.9717 0.6332 sec/batch
Epoch 7/20  Iteration 1094/3560 Training loss: 1.9711 0.6339 sec/batch
Epoch 7/20  Iteration 1095/3560 Training loss: 1.9701 0.6394 sec/batch
Epoch 7/20  Iteration 1096/3560 Training loss: 1.9701 0.6359 sec/batch
Epoch 7/20  Iteration 1097/3560 Training loss: 1.9709 0.6340 sec/batch
Epoch 7/20  Iteration 1098/3560 Training loss: 1.9712 0.6507 sec/batch
Epoch 7/20  Iteration 1099/3560 Training loss: 1.9710 0.6208 sec/batch
Epoch 7/20  Iteration 1100/3560 Training loss: 1.9699 0.6294 sec/batch
Epoch 7/20  Iteration 1101/3560 Training loss: 1.9698 0.6415 sec/batch
Epoch 7/20  Iteration 1102/3560 Training loss: 1.9701 0.6452 sec/batch
Epoch 7/20  Iteration 1103/3560 Training loss: 1.9696 0.6367 sec/batch
Epoch 7/20  Iteration 1104/3560 Training loss: 1.9694 0.6371 sec/batch
Epoch 7/20  Iteration 1105/3560 Training loss: 1.9687 0.6569 sec/batch
Epoch 7/20  Iteration 1106/3560 Training loss: 1.9675 0.6366 sec/batch
Epoch 7/20  Iteration 1107/3560 Training loss: 1.9664 0.6382 sec/batch
Epoch 7/20  Iteration 1108/3560 Training loss: 1.9657 0.6292 sec/batch
Epoch 7/20  Iteration 1109/3560 Training loss: 1.9654 0.6377 sec/batch
Epoch 7/20  Iteration 1110/3560 Training loss: 1.9651 0.6324 sec/batch
Epoch 7/20  Iteration 1111/3560 Training loss: 1.9646 0.6334 sec/batch
Epoch 7/20  Iteration 1112/3560 Training loss: 1.9637 0.6280 sec/batch
Epoch 7/20  Iteration 1113/3560 Training loss: 1.9637 0.6494 sec/batch
Epoch 7/20  Iteration 1114/3560 Training loss: 1.9624 0.6349 sec/batch
Epoch 7/20  Iteration 1115/3560 Training loss: 1.9622 0.6400 sec/batch
Epoch 7/20  Iteration 1116/3560 Training loss: 1.9616 0.6424 sec/batch
Epoch 7/20  Iteration 1117/3560 Training loss: 1.9614 0.6225 sec/batch
Epoch 7/20  Iteration 1118/3560 Training loss: 1.9621 0.6218 sec/batch
Epoch 7/20  Iteration 1119/3560 Training loss: 1.9613 0.6367 sec/batch
Epoch 7/20  Iteration 1120/3560 Training loss: 1.9620 0.6360 sec/batch
Epoch 7/20  Iteration 1121/3560 Training loss: 1.9618 0.6511 sec/batch
Epoch 7/20  Iteration 1122/3560 Training loss: 1.9616 0.6391 sec/batch
Epoch 7/20  Iteration 1123/3560 Training loss: 1.9610 0.6403 sec/batch
Epoch 7/20  Iteration 1124/3560 Training loss: 1.9611 0.6395 sec/batch
Epoch 7/20  Iteration 1125/3560 Training loss: 1.9613 0.6352 sec/batch
Epoch 7/20  Iteration 1126/3560 Training loss: 1.9609 0.6398 sec/batch
Epoch 7/20  Iteration 1127/3560 Training loss: 1.9603 0.6381 sec/batch
Epoch 7/20  Iteration 1128/3560 Training loss: 1.9604 0.6626 sec/batch
Epoch 7/20  Iteration 1129/3560 Training loss: 1.9603 0.6531 sec/batch
Epoch 7/20  Iteration 1130/3560 Training loss: 1.9608 0.6345 sec/batch
Epoch 7/20  Iteration 1131/3560 Training loss: 1.9611 0.6372 sec/batch
Epoch 7/20  Iteration 1132/3560 Training loss: 1.9612 0.6387 sec/batch
Epoch 7/20  Iteration 1133/3560 Training loss: 1.9608 0.6449 sec/batch
Epoch 7/20  Iteration 1134/3560 Training loss: 1.9611 0.6469 sec/batch
Epoch 7/20  Iteration 1135/3560 Training loss: 1.9612 0.6264 sec/batch
Epoch 7/20  Iteration 1136/3560 Training loss: 1.9606 0.6287 sec/batch
Epoch 7/20  Iteration 1137/3560 Training loss: 1.9603 0.6516 sec/batch
Epoch 7/20  Iteration 1138/3560 Training loss: 1.9601 0.6387 sec/batch
Epoch 7/20  Iteration 1139/3560 Training loss: 1.9604 0.6411 sec/batch
Epoch 7/20  Iteration 1140/3560 Training loss: 1.9604 0.6383 sec/batch
Epoch 7/20  Iteration 1141/3560 Training loss: 1.9606 0.6358 sec/batch
Epoch 7/20  Iteration 1142/3560 Training loss: 1.9603 0.6272 sec/batch
Epoch 7/20  Iteration 1143/3560 Training loss: 1.9602 0.6511 sec/batch
Epoch 7/20  Iteration 1144/3560 Training loss: 1.9603 0.6395 sec/batch
Epoch 7/20  Iteration 1145/3560 Training loss: 1.9601 0.6497 sec/batch
Epoch 7/20  Iteration 1146/3560 Training loss: 1.9602 0.6366 sec/batch
Epoch 7/20  Iteration 1147/3560 Training loss: 1.9598 0.6254 sec/batch
Epoch 7/20  Iteration 1148/3560 Training loss: 1.9596 0.6378 sec/batch
Epoch 7/20  Iteration 1149/3560 Training loss: 1.9590 0.6344 sec/batch
Epoch 7/20  Iteration 1150/3560 Training loss: 1.9589 0.6240 sec/batch
Epoch 7/20  Iteration 1151/3560 Training loss: 1.9584 0.6222 sec/batch
Epoch 7/20  Iteration 1152/3560 Training loss: 1.9583 0.6534 sec/batch
Epoch 7/20  Iteration 1153/3560 Training loss: 1.9577 0.6577 sec/batch
Epoch 7/20  Iteration 1154/3560 Training loss: 1.9573 0.6264 sec/batch
Epoch 7/20  Iteration 1155/3560 Training loss: 1.9571 0.6392 sec/batch
Epoch 7/20  Iteration 1156/3560 Training loss: 1.9568 0.6396 sec/batch
Epoch 7/20  Iteration 1157/3560 Training loss: 1.9563 0.6267 sec/batch
Epoch 7/20  Iteration 1158/3560 Training loss: 1.9564 0.6386 sec/batch
Epoch 7/20  Iteration 1159/3560 Training loss: 1.9560 0.6378 sec/batch
Epoch 7/20  Iteration 1160/3560 Training loss: 1.9558 0.6659 sec/batch
Epoch 7/20  Iteration 1161/3560 Training loss: 1.9553 0.6504 sec/batch
Epoch 7/20  Iteration 1162/3560 Training loss: 1.9549 0.6308 sec/batch
Epoch 7/20  Iteration 1163/3560 Training loss: 1.9544 0.6286 sec/batch
Epoch 7/20  Iteration 1164/3560 Training loss: 1.9542 0.6406 sec/batch
Epoch 7/20  Iteration 1165/3560 Training loss: 1.9540 0.6345 sec/batch
Epoch 7/20  Iteration 1166/3560 Training loss: 1.9535 0.6288 sec/batch
Epoch 7/20  Iteration 1167/3560 Training loss: 1.9531 0.6379 sec/batch
Epoch 7/20  Iteration 1168/3560 Training loss: 1.9527 0.6344 sec/batch
Epoch 7/20  Iteration 1169/3560 Training loss: 1.9527 0.6608 sec/batch
Epoch 7/20  Iteration 1170/3560 Training loss: 1.9526 0.6351 sec/batch
Epoch 7/20  Iteration 1171/3560 Training loss: 1.9523 0.6287 sec/batch
Epoch 7/20  Iteration 1172/3560 Training loss: 1.9520 0.6229 sec/batch
Epoch 7/20  Iteration 1173/3560 Training loss: 1.9517 0.6342 sec/batch
Epoch 7/20  Iteration 1174/3560 Training loss: 1.9516 0.6521 sec/batch
Epoch 7/20  Iteration 1175/3560 Training loss: 1.9514 0.6420 sec/batch
Epoch 7/20  Iteration 1176/3560 Training loss: 1.9513 0.6412 sec/batch
Epoch 7/20  Iteration 1177/3560 Training loss: 1.9513 0.6332 sec/batch
Epoch 7/20  Iteration 1178/3560 Training loss: 1.9511 0.6371 sec/batch
Epoch 7/20  Iteration 1179/3560 Training loss: 1.9511 0.6444 sec/batch
Epoch 7/20  Iteration 1180/3560 Training loss: 1.9508 0.6281 sec/batch
Epoch 7/20  Iteration 1181/3560 Training loss: 1.9506 0.6319 sec/batch
Epoch 7/20  Iteration 1182/3560 Training loss: 1.9504 0.6273 sec/batch
Epoch 7/20  Iteration 1183/3560 Training loss: 1.9502 0.6315 sec/batch
Epoch 7/20  Iteration 1184/3560 Training loss: 1.9498 0.6409 sec/batch
Epoch 7/20  Iteration 1185/3560 Training loss: 1.9496 0.6303 sec/batch
Epoch 7/20  Iteration 1186/3560 Training loss: 1.9496 0.6319 sec/batch
Epoch 7/20  Iteration 1187/3560 Training loss: 1.9496 0.6312 sec/batch
Epoch 7/20  Iteration 1188/3560 Training loss: 1.9495 0.6462 sec/batch
Epoch 7/20  Iteration 1189/3560 Training loss: 1.9494 0.6397 sec/batch
Epoch 7/20  Iteration 1190/3560 Training loss: 1.9491 0.6364 sec/batch
Epoch 7/20  Iteration 1191/3560 Training loss: 1.9489 0.6331 sec/batch
Epoch 7/20  Iteration 1192/3560 Training loss: 1.9489 0.6440 sec/batch
Epoch 7/20  Iteration 1193/3560 Training loss: 1.9488 0.6374 sec/batch
Epoch 7/20  Iteration 1194/3560 Training loss: 1.9484 0.6281 sec/batch
Epoch 7/20  Iteration 1195/3560 Training loss: 1.9483 0.6227 sec/batch
Epoch 7/20  Iteration 1196/3560 Training loss: 1.9484 0.6260 sec/batch
Epoch 7/20  Iteration 1197/3560 Training loss: 1.9482 0.6446 sec/batch
Epoch 7/20  Iteration 1198/3560 Training loss: 1.9481 0.6298 sec/batch
Epoch 7/20  Iteration 1199/3560 Training loss: 1.9478 0.6311 sec/batch
Epoch 7/20  Iteration 1200/3560 Training loss: 1.9476 0.6514 sec/batch
Epoch 7/20  Iteration 1201/3560 Training loss: 1.9476 0.6274 sec/batch
Epoch 7/20  Iteration 1202/3560 Training loss: 1.9475 0.6365 sec/batch
Epoch 7/20  Iteration 1203/3560 Training loss: 1.9473 0.6316 sec/batch
Epoch 7/20  Iteration 1204/3560 Training loss: 1.9474 0.6323 sec/batch
Epoch 7/20  Iteration 1205/3560 Training loss: 1.9474 0.6310 sec/batch
Epoch 7/20  Iteration 1206/3560 Training loss: 1.9474 0.6405 sec/batch
Epoch 7/20  Iteration 1207/3560 Training loss: 1.9474 0.6428 sec/batch
Epoch 7/20  Iteration 1208/3560 Training loss: 1.9473 0.6567 sec/batch
Epoch 7/20  Iteration 1209/3560 Training loss: 1.9474 0.6309 sec/batch
Epoch 7/20  Iteration 1210/3560 Training loss: 1.9473 0.6394 sec/batch
Epoch 7/20  Iteration 1211/3560 Training loss: 1.9471 0.6387 sec/batch
Epoch 7/20  Iteration 1212/3560 Training loss: 1.9471 0.6435 sec/batch
Epoch 7/20  Iteration 1213/3560 Training loss: 1.9469 0.6341 sec/batch
Epoch 7/20  Iteration 1214/3560 Training loss: 1.9468 0.6403 sec/batch
Epoch 7/20  Iteration 1215/3560 Training loss: 1.9468 0.6401 sec/batch
Epoch 7/20  Iteration 1216/3560 Training loss: 1.9468 0.6702 sec/batch
Epoch 7/20  Iteration 1217/3560 Training loss: 1.9467 0.6366 sec/batch
Epoch 7/20  Iteration 1218/3560 Training loss: 1.9466 0.6372 sec/batch
Epoch 7/20  Iteration 1219/3560 Training loss: 1.9464 0.6371 sec/batch
Epoch 7/20  Iteration 1220/3560 Training loss: 1.9465 0.6332 sec/batch
Epoch 7/20  Iteration 1221/3560 Training loss: 1.9465 0.6551 sec/batch
Epoch 7/20  Iteration 1222/3560 Training loss: 1.9464 0.6382 sec/batch
Epoch 7/20  Iteration 1223/3560 Training loss: 1.9463 0.6316 sec/batch
Epoch 7/20  Iteration 1224/3560 Training loss: 1.9462 0.6501 sec/batch
Epoch 7/20  Iteration 1225/3560 Training loss: 1.9461 0.6415 sec/batch
Epoch 7/20  Iteration 1226/3560 Training loss: 1.9460 0.6309 sec/batch
Epoch 7/20  Iteration 1227/3560 Training loss: 1.9456 0.6370 sec/batch
Epoch 7/20  Iteration 1228/3560 Training loss: 1.9457 0.6353 sec/batch
Epoch 7/20  Iteration 1229/3560 Training loss: 1.9457 0.6335 sec/batch
Epoch 7/20  Iteration 1230/3560 Training loss: 1.9456 0.6383 sec/batch
Epoch 7/20  Iteration 1231/3560 Training loss: 1.9456 0.6484 sec/batch
Epoch 7/20  Iteration 1232/3560 Training loss: 1.9455 0.6590 sec/batch
Epoch 7/20  Iteration 1233/3560 Training loss: 1.9454 0.6588 sec/batch
Epoch 7/20  Iteration 1234/3560 Training loss: 1.9452 0.6554 sec/batch
Epoch 7/20  Iteration 1235/3560 Training loss: 1.9451 0.6331 sec/batch
Epoch 7/20  Iteration 1236/3560 Training loss: 1.9453 0.6352 sec/batch
Epoch 7/20  Iteration 1237/3560 Training loss: 1.9452 0.6341 sec/batch
Epoch 7/20  Iteration 1238/3560 Training loss: 1.9450 0.6350 sec/batch
Epoch 7/20  Iteration 1239/3560 Training loss: 1.9448 0.6356 sec/batch
Epoch 7/20  Iteration 1240/3560 Training loss: 1.9446 0.6529 sec/batch
Epoch 7/20  Iteration 1241/3560 Training loss: 1.9446 0.6317 sec/batch
Epoch 7/20  Iteration 1242/3560 Training loss: 1.9446 0.7147 sec/batch
Epoch 7/20  Iteration 1243/3560 Training loss: 1.9446 0.6415 sec/batch
Epoch 7/20  Iteration 1244/3560 Training loss: 1.9444 0.6715 sec/batch
Epoch 7/20  Iteration 1245/3560 Training loss: 1.9442 0.6698 sec/batch
Epoch 7/20  Iteration 1246/3560 Training loss: 1.9442 0.6433 sec/batch
Epoch 8/20  Iteration 1247/3560 Training loss: 1.9813 0.6509 sec/batch
Epoch 8/20  Iteration 1248/3560 Training loss: 1.9519 0.6410 sec/batch
Epoch 8/20  Iteration 1249/3560 Training loss: 1.9359 0.6398 sec/batch
Epoch 8/20  Iteration 1250/3560 Training loss: 1.9293 0.6515 sec/batch
Epoch 8/20  Iteration 1251/3560 Training loss: 1.9248 0.6408 sec/batch
Epoch 8/20  Iteration 1252/3560 Training loss: 1.9189 0.6391 sec/batch
Epoch 8/20  Iteration 1253/3560 Training loss: 1.9194 0.6380 sec/batch
Epoch 8/20  Iteration 1254/3560 Training loss: 1.9201 0.6447 sec/batch
Epoch 8/20  Iteration 1255/3560 Training loss: 1.9239 0.6508 sec/batch
Epoch 8/20  Iteration 1256/3560 Training loss: 1.9243 0.6370 sec/batch
Epoch 8/20  Iteration 1257/3560 Training loss: 1.9211 0.6335 sec/batch
Epoch 8/20  Iteration 1258/3560 Training loss: 1.9195 0.6413 sec/batch
Epoch 8/20  Iteration 1259/3560 Training loss: 1.9194 0.6442 sec/batch
Epoch 8/20  Iteration 1260/3560 Training loss: 1.9219 0.6382 sec/batch
Epoch 8/20  Iteration 1261/3560 Training loss: 1.9213 0.6385 sec/batch
Epoch 8/20  Iteration 1262/3560 Training loss: 1.9194 0.6409 sec/batch
Epoch 8/20  Iteration 1263/3560 Training loss: 1.9193 0.6484 sec/batch
Epoch 8/20  Iteration 1264/3560 Training loss: 1.9210 0.6316 sec/batch
Epoch 8/20  Iteration 1265/3560 Training loss: 1.9204 0.6270 sec/batch
Epoch 8/20  Iteration 1266/3560 Training loss: 1.9203 0.6386 sec/batch
Epoch 8/20  Iteration 1267/3560 Training loss: 1.9193 0.6408 sec/batch
Epoch 8/20  Iteration 1268/3560 Training loss: 1.9202 0.6409 sec/batch
Epoch 8/20  Iteration 1269/3560 Training loss: 1.9195 0.6331 sec/batch
Epoch 8/20  Iteration 1270/3560 Training loss: 1.9188 0.6405 sec/batch
Epoch 8/20  Iteration 1271/3560 Training loss: 1.9182 0.6517 sec/batch
Epoch 8/20  Iteration 1272/3560 Training loss: 1.9173 0.6400 sec/batch
Epoch 8/20  Iteration 1273/3560 Training loss: 1.9163 0.6376 sec/batch
Epoch 8/20  Iteration 1274/3560 Training loss: 1.9162 0.6383 sec/batch
Epoch 8/20  Iteration 1275/3560 Training loss: 1.9168 0.6402 sec/batch
Epoch 8/20  Iteration 1276/3560 Training loss: 1.9173 0.6351 sec/batch
Epoch 8/20  Iteration 1277/3560 Training loss: 1.9172 0.6384 sec/batch
Epoch 8/20  Iteration 1278/3560 Training loss: 1.9168 0.6422 sec/batch
Epoch 8/20  Iteration 1279/3560 Training loss: 1.9173 0.6594 sec/batch
Epoch 8/20  Iteration 1280/3560 Training loss: 1.9180 0.6403 sec/batch
Epoch 8/20  Iteration 1281/3560 Training loss: 1.9176 0.6086 sec/batch
Epoch 8/20  Iteration 1282/3560 Training loss: 1.9172 0.6335 sec/batch
Epoch 8/20  Iteration 1283/3560 Training loss: 1.9169 0.6299 sec/batch
Epoch 8/20  Iteration 1284/3560 Training loss: 1.9158 0.6369 sec/batch
Epoch 8/20  Iteration 1285/3560 Training loss: 1.9147 0.6384 sec/batch
Epoch 8/20  Iteration 1286/3560 Training loss: 1.9139 0.6372 sec/batch
Epoch 8/20  Iteration 1287/3560 Training loss: 1.9132 0.6593 sec/batch
Epoch 8/20  Iteration 1288/3560 Training loss: 1.9131 0.6412 sec/batch
Epoch 8/20  Iteration 1289/3560 Training loss: 1.9126 0.6373 sec/batch
Epoch 8/20  Iteration 1290/3560 Training loss: 1.9117 0.6432 sec/batch
Epoch 8/20  Iteration 1291/3560 Training loss: 1.9118 0.6501 sec/batch
Epoch 8/20  Iteration 1292/3560 Training loss: 1.9106 0.6317 sec/batch
Epoch 8/20  Iteration 1293/3560 Training loss: 1.9105 0.6328 sec/batch
Epoch 8/20  Iteration 1294/3560 Training loss: 1.9100 0.6403 sec/batch
Epoch 8/20  Iteration 1295/3560 Training loss: 1.9096 0.6411 sec/batch
Epoch 8/20  Iteration 1296/3560 Training loss: 1.9104 0.6297 sec/batch
Epoch 8/20  Iteration 1297/3560 Training loss: 1.9099 0.6474 sec/batch
Epoch 8/20  Iteration 1298/3560 Training loss: 1.9108 0.6386 sec/batch
Epoch 8/20  Iteration 1299/3560 Training loss: 1.9106 0.6377 sec/batch
Epoch 8/20  Iteration 1300/3560 Training loss: 1.9104 0.6361 sec/batch
Epoch 8/20  Iteration 1301/3560 Training loss: 1.9100 0.6271 sec/batch
Epoch 8/20  Iteration 1302/3560 Training loss: 1.9102 0.6507 sec/batch
Epoch 8/20  Iteration 1303/3560 Training loss: 1.9104 0.6319 sec/batch
Epoch 8/20  Iteration 1304/3560 Training loss: 1.9103 0.6314 sec/batch
Epoch 8/20  Iteration 1305/3560 Training loss: 1.9099 0.6352 sec/batch
Epoch 8/20  Iteration 1306/3560 Training loss: 1.9102 0.6480 sec/batch
Epoch 8/20  Iteration 1307/3560 Training loss: 1.9099 0.6422 sec/batch
Epoch 8/20  Iteration 1308/3560 Training loss: 1.9105 0.6348 sec/batch
Epoch 8/20  Iteration 1309/3560 Training loss: 1.9110 0.6313 sec/batch
Epoch 8/20  Iteration 1310/3560 Training loss: 1.9111 0.6468 sec/batch
Epoch 8/20  Iteration 1311/3560 Training loss: 1.9108 0.6291 sec/batch
Epoch 8/20  Iteration 1312/3560 Training loss: 1.9110 0.6316 sec/batch
Epoch 8/20  Iteration 1313/3560 Training loss: 1.9110 0.6437 sec/batch
Epoch 8/20  Iteration 1314/3560 Training loss: 1.9104 0.6463 sec/batch
Epoch 8/20  Iteration 1315/3560 Training loss: 1.9103 0.6338 sec/batch
Epoch 8/20  Iteration 1316/3560 Training loss: 1.9101 0.6331 sec/batch
Epoch 8/20  Iteration 1317/3560 Training loss: 1.9105 0.6374 sec/batch
Epoch 8/20  Iteration 1318/3560 Training loss: 1.9104 0.6525 sec/batch
Epoch 8/20  Iteration 1319/3560 Training loss: 1.9106 0.6302 sec/batch
Epoch 8/20  Iteration 1320/3560 Training loss: 1.9103 0.6449 sec/batch
Epoch 8/20  Iteration 1321/3560 Training loss: 1.9103 0.6392 sec/batch
Epoch 8/20  Iteration 1322/3560 Training loss: 1.9104 0.6284 sec/batch
Epoch 8/20  Iteration 1323/3560 Training loss: 1.9104 0.6397 sec/batch
Epoch 8/20  Iteration 1324/3560 Training loss: 1.9103 0.6397 sec/batch
Epoch 8/20  Iteration 1325/3560 Training loss: 1.9098 0.6443 sec/batch
Epoch 8/20  Iteration 1326/3560 Training loss: 1.9095 0.6537 sec/batch
Epoch 8/20  Iteration 1327/3560 Training loss: 1.9089 0.6503 sec/batch
Epoch 8/20  Iteration 1328/3560 Training loss: 1.9090 0.6377 sec/batch
Epoch 8/20  Iteration 1329/3560 Training loss: 1.9085 0.6364 sec/batch
Epoch 8/20  Iteration 1330/3560 Training loss: 1.9084 0.6336 sec/batch
Epoch 8/20  Iteration 1331/3560 Training loss: 1.9078 0.6338 sec/batch
Epoch 8/20  Iteration 1332/3560 Training loss: 1.9074 0.6298 sec/batch
Epoch 8/20  Iteration 1333/3560 Training loss: 1.9074 0.6354 sec/batch
Epoch 8/20  Iteration 1334/3560 Training loss: 1.9071 0.6651 sec/batch
Epoch 8/20  Iteration 1335/3560 Training loss: 1.9066 0.6259 sec/batch
Epoch 8/20  Iteration 1336/3560 Training loss: 1.9067 0.6224 sec/batch
Epoch 8/20  Iteration 1337/3560 Training loss: 1.9063 0.6376 sec/batch
Epoch 8/20  Iteration 1338/3560 Training loss: 1.9063 0.6367 sec/batch
Epoch 8/20  Iteration 1339/3560 Training loss: 1.9058 0.6351 sec/batch
Epoch 8/20  Iteration 1340/3560 Training loss: 1.9055 0.6364 sec/batch
Epoch 8/20  Iteration 1341/3560 Training loss: 1.9052 0.6339 sec/batch
Epoch 8/20  Iteration 1342/3560 Training loss: 1.9052 0.6509 sec/batch
Epoch 8/20  Iteration 1343/3560 Training loss: 1.9049 0.6457 sec/batch
Epoch 8/20  Iteration 1344/3560 Training loss: 1.9045 0.6400 sec/batch
Epoch 8/20  Iteration 1345/3560 Training loss: 1.9043 0.6538 sec/batch
Epoch 8/20  Iteration 1346/3560 Training loss: 1.9038 0.6673 sec/batch
Epoch 8/20  Iteration 1347/3560 Training loss: 1.9038 0.6354 sec/batch
Epoch 8/20  Iteration 1348/3560 Training loss: 1.9037 0.6281 sec/batch
Epoch 8/20  Iteration 1349/3560 Training loss: 1.9035 0.6189 sec/batch
Epoch 8/20  Iteration 1350/3560 Training loss: 1.9033 0.6639 sec/batch
Epoch 8/20  Iteration 1351/3560 Training loss: 1.9030 0.6511 sec/batch
Epoch 8/20  Iteration 1352/3560 Training loss: 1.9029 0.6345 sec/batch
Epoch 8/20  Iteration 1353/3560 Training loss: 1.9027 0.6349 sec/batch
Epoch 8/20  Iteration 1354/3560 Training loss: 1.9027 0.6394 sec/batch
Epoch 8/20  Iteration 1355/3560 Training loss: 1.9026 0.6354 sec/batch
Epoch 8/20  Iteration 1356/3560 Training loss: 1.9025 0.6314 sec/batch
Epoch 8/20  Iteration 1357/3560 Training loss: 1.9023 0.6282 sec/batch
Epoch 8/20  Iteration 1358/3560 Training loss: 1.9020 0.6536 sec/batch
Epoch 8/20  Iteration 1359/3560 Training loss: 1.9020 0.6350 sec/batch
Epoch 8/20  Iteration 1360/3560 Training loss: 1.9018 0.6543 sec/batch
Epoch 8/20  Iteration 1361/3560 Training loss: 1.9016 0.6552 sec/batch
Epoch 8/20  Iteration 1362/3560 Training loss: 1.9012 0.7253 sec/batch
Epoch 8/20  Iteration 1363/3560 Training loss: 1.9011 0.6688 sec/batch
Epoch 8/20  Iteration 1364/3560 Training loss: 1.9010 0.6567 sec/batch
Epoch 8/20  Iteration 1365/3560 Training loss: 1.9009 0.6503 sec/batch
Epoch 8/20  Iteration 1366/3560 Training loss: 1.9009 0.6366 sec/batch
Epoch 8/20  Iteration 1367/3560 Training loss: 1.9009 0.6380 sec/batch
Epoch 8/20  Iteration 1368/3560 Training loss: 1.9006 0.8425 sec/batch
Epoch 8/20  Iteration 1369/3560 Training loss: 1.9003 0.7157 sec/batch
Epoch 8/20  Iteration 1370/3560 Training loss: 1.9003 0.6356 sec/batch
Epoch 8/20  Iteration 1371/3560 Training loss: 1.9003 0.6338 sec/batch
Epoch 8/20  Iteration 1372/3560 Training loss: 1.8999 0.6314 sec/batch
Epoch 8/20  Iteration 1373/3560 Training loss: 1.9000 0.6537 sec/batch
Epoch 8/20  Iteration 1374/3560 Training loss: 1.9001 0.6306 sec/batch
Epoch 8/20  Iteration 1375/3560 Training loss: 1.9000 0.6333 sec/batch
Epoch 8/20  Iteration 1376/3560 Training loss: 1.8999 0.6377 sec/batch
Epoch 8/20  Iteration 1377/3560 Training loss: 1.8997 0.6317 sec/batch
Epoch 8/20  Iteration 1378/3560 Training loss: 1.8994 0.6361 sec/batch
Epoch 8/20  Iteration 1379/3560 Training loss: 1.8994 0.6276 sec/batch
Epoch 8/20  Iteration 1380/3560 Training loss: 1.8994 0.6347 sec/batch
Epoch 8/20  Iteration 1381/3560 Training loss: 1.8993 0.6527 sec/batch
Epoch 8/20  Iteration 1382/3560 Training loss: 1.8993 0.6740 sec/batch
Epoch 8/20  Iteration 1383/3560 Training loss: 1.8994 0.7365 sec/batch
Epoch 8/20  Iteration 1384/3560 Training loss: 1.8993 0.7059 sec/batch
Epoch 8/20  Iteration 1385/3560 Training loss: 1.8993 0.6975 sec/batch
Epoch 8/20  Iteration 1386/3560 Training loss: 1.8991 0.6703 sec/batch
Epoch 8/20  Iteration 1387/3560 Training loss: 1.8993 0.6822 sec/batch
Epoch 8/20  Iteration 1388/3560 Training loss: 1.8992 0.6668 sec/batch
Epoch 8/20  Iteration 1389/3560 Training loss: 1.8992 0.6562 sec/batch
Epoch 8/20  Iteration 1390/3560 Training loss: 1.8992 0.6502 sec/batch
Epoch 8/20  Iteration 1391/3560 Training loss: 1.8991 0.6575 sec/batch
Epoch 8/20  Iteration 1392/3560 Training loss: 1.8992 0.6852 sec/batch
Epoch 8/20  Iteration 1393/3560 Training loss: 1.8992 0.6855 sec/batch
Epoch 8/20  Iteration 1394/3560 Training loss: 1.8994 0.6586 sec/batch
Epoch 8/20  Iteration 1395/3560 Training loss: 1.8993 0.6744 sec/batch
Epoch 8/20  Iteration 1396/3560 Training loss: 1.8991 0.7481 sec/batch
Epoch 8/20  Iteration 1397/3560 Training loss: 1.8988 0.6681 sec/batch
Epoch 8/20  Iteration 1398/3560 Training loss: 1.8990 0.6469 sec/batch
Epoch 8/20  Iteration 1399/3560 Training loss: 1.8990 0.6493 sec/batch
Epoch 8/20  Iteration 1400/3560 Training loss: 1.8990 0.6469 sec/batch
Epoch 8/20  Iteration 1401/3560 Training loss: 1.8989 0.6398 sec/batch
Epoch 8/20  Iteration 1402/3560 Training loss: 1.8988 0.6278 sec/batch
Epoch 8/20  Iteration 1403/3560 Training loss: 1.8988 0.6499 sec/batch
Epoch 8/20  Iteration 1404/3560 Training loss: 1.8987 0.6503 sec/batch
Epoch 8/20  Iteration 1405/3560 Training loss: 1.8984 0.6589 sec/batch
Epoch 8/20  Iteration 1406/3560 Training loss: 1.8985 0.6449 sec/batch
Epoch 8/20  Iteration 1407/3560 Training loss: 1.8986 0.6406 sec/batch
Epoch 8/20  Iteration 1408/3560 Training loss: 1.8985 0.6341 sec/batch
Epoch 8/20  Iteration 1409/3560 Training loss: 1.8985 0.6403 sec/batch
Epoch 8/20  Iteration 1410/3560 Training loss: 1.8984 0.6436 sec/batch
Epoch 8/20  Iteration 1411/3560 Training loss: 1.8983 0.6537 sec/batch
Epoch 8/20  Iteration 1412/3560 Training loss: 1.8982 0.6433 sec/batch
Epoch 8/20  Iteration 1413/3560 Training loss: 1.8982 0.6470 sec/batch
Epoch 8/20  Iteration 1414/3560 Training loss: 1.8984 0.6516 sec/batch
Epoch 8/20  Iteration 1415/3560 Training loss: 1.8983 0.6542 sec/batch
Epoch 8/20  Iteration 1416/3560 Training loss: 1.8982 0.6477 sec/batch
Epoch 8/20  Iteration 1417/3560 Training loss: 1.8981 0.6457 sec/batch
Epoch 8/20  Iteration 1418/3560 Training loss: 1.8979 0.7495 sec/batch
Epoch 8/20  Iteration 1419/3560 Training loss: 1.8980 0.7986 sec/batch
Epoch 8/20  Iteration 1420/3560 Training loss: 1.8979 0.7557 sec/batch
Epoch 8/20  Iteration 1421/3560 Training loss: 1.8979 0.7200 sec/batch
Epoch 8/20  Iteration 1422/3560 Training loss: 1.8978 0.8183 sec/batch
Epoch 8/20  Iteration 1423/3560 Training loss: 1.8976 0.8111 sec/batch
Epoch 8/20  Iteration 1424/3560 Training loss: 1.8976 0.7772 sec/batch
Epoch 9/20  Iteration 1425/3560 Training loss: 1.9373 0.8808 sec/batch
Epoch 9/20  Iteration 1426/3560 Training loss: 1.9004 0.8010 sec/batch
Epoch 9/20  Iteration 1427/3560 Training loss: 1.8914 0.7249 sec/batch
Epoch 9/20  Iteration 1428/3560 Training loss: 1.8882 0.7581 sec/batch
Epoch 9/20  Iteration 1429/3560 Training loss: 1.8848 0.7975 sec/batch
Epoch 9/20  Iteration 1430/3560 Training loss: 1.8762 0.7488 sec/batch
Epoch 9/20  Iteration 1431/3560 Training loss: 1.8774 0.7875 sec/batch
Epoch 9/20  Iteration 1432/3560 Training loss: 1.8770 0.8057 sec/batch
Epoch 9/20  Iteration 1433/3560 Training loss: 1.8799 0.7375 sec/batch
Epoch 9/20  Iteration 1434/3560 Training loss: 1.8789 0.7599 sec/batch
Epoch 9/20  Iteration 1435/3560 Training loss: 1.8761 0.7607 sec/batch
Epoch 9/20  Iteration 1436/3560 Training loss: 1.8749 0.7320 sec/batch
Epoch 9/20  Iteration 1437/3560 Training loss: 1.8760 0.7674 sec/batch
Epoch 9/20  Iteration 1438/3560 Training loss: 1.8791 0.7812 sec/batch
Epoch 9/20  Iteration 1439/3560 Training loss: 1.8788 0.7800 sec/batch
Epoch 9/20  Iteration 1440/3560 Training loss: 1.8778 0.7704 sec/batch
Epoch 9/20  Iteration 1441/3560 Training loss: 1.8775 0.7726 sec/batch
Epoch 9/20  Iteration 1442/3560 Training loss: 1.8796 0.7764 sec/batch
Epoch 9/20  Iteration 1443/3560 Training loss: 1.8795 0.8391 sec/batch
Epoch 9/20  Iteration 1444/3560 Training loss: 1.8798 0.7502 sec/batch
Epoch 9/20  Iteration 1445/3560 Training loss: 1.8789 0.7006 sec/batch
Epoch 9/20  Iteration 1446/3560 Training loss: 1.8799 0.6706 sec/batch
Epoch 9/20  Iteration 1447/3560 Training loss: 1.8789 0.6535 sec/batch
Epoch 9/20  Iteration 1448/3560 Training loss: 1.8783 0.7268 sec/batch
Epoch 9/20  Iteration 1449/3560 Training loss: 1.8778 0.7682 sec/batch
Epoch 9/20  Iteration 1450/3560 Training loss: 1.8770 0.7217 sec/batch
Epoch 9/20  Iteration 1451/3560 Training loss: 1.8764 0.7146 sec/batch
Epoch 9/20  Iteration 1452/3560 Training loss: 1.8764 0.7980 sec/batch
Epoch 9/20  Iteration 1453/3560 Training loss: 1.8774 0.7759 sec/batch
Epoch 9/20  Iteration 1454/3560 Training loss: 1.8776 0.7374 sec/batch
Epoch 9/20  Iteration 1455/3560 Training loss: 1.8775 0.7086 sec/batch
Epoch 9/20  Iteration 1456/3560 Training loss: 1.8769 0.7793 sec/batch
Epoch 9/20  Iteration 1457/3560 Training loss: 1.8771 0.7308 sec/batch
Epoch 9/20  Iteration 1458/3560 Training loss: 1.8776 0.7721 sec/batch
Epoch 9/20  Iteration 1459/3560 Training loss: 1.8774 0.8071 sec/batch
Epoch 9/20  Iteration 1460/3560 Training loss: 1.8773 0.8073 sec/batch
Epoch 9/20  Iteration 1461/3560 Training loss: 1.8768 0.8168 sec/batch
Epoch 9/20  Iteration 1462/3560 Training loss: 1.8760 0.7206 sec/batch
Epoch 9/20  Iteration 1463/3560 Training loss: 1.8750 0.7894 sec/batch
Epoch 9/20  Iteration 1464/3560 Training loss: 1.8745 0.7803 sec/batch
Epoch 9/20  Iteration 1465/3560 Training loss: 1.8740 0.7090 sec/batch
Epoch 9/20  Iteration 1466/3560 Training loss: 1.8741 0.6536 sec/batch
Epoch 9/20  Iteration 1467/3560 Training loss: 1.8736 0.7679 sec/batch
Epoch 9/20  Iteration 1468/3560 Training loss: 1.8726 0.6909 sec/batch
Epoch 9/20  Iteration 1469/3560 Training loss: 1.8728 0.7766 sec/batch
Epoch 9/20  Iteration 1470/3560 Training loss: 1.8717 0.7394 sec/batch
Epoch 9/20  Iteration 1471/3560 Training loss: 1.8716 0.6911 sec/batch
Epoch 9/20  Iteration 1472/3560 Training loss: 1.8712 0.7737 sec/batch
Epoch 9/20  Iteration 1473/3560 Training loss: 1.8711 0.7386 sec/batch
Epoch 9/20  Iteration 1474/3560 Training loss: 1.8719 0.7635 sec/batch
Epoch 9/20  Iteration 1475/3560 Training loss: 1.8715 0.6973 sec/batch
Epoch 9/20  Iteration 1476/3560 Training loss: 1.8722 0.8083 sec/batch
Epoch 9/20  Iteration 1477/3560 Training loss: 1.8720 0.8123 sec/batch
Epoch 9/20  Iteration 1478/3560 Training loss: 1.8721 0.8159 sec/batch
Epoch 9/20  Iteration 1479/3560 Training loss: 1.8716 0.6696 sec/batch
Epoch 9/20  Iteration 1480/3560 Training loss: 1.8719 0.7605 sec/batch
Epoch 9/20  Iteration 1481/3560 Training loss: 1.8721 0.7695 sec/batch
Epoch 9/20  Iteration 1482/3560 Training loss: 1.8717 0.7353 sec/batch
Epoch 9/20  Iteration 1483/3560 Training loss: 1.8712 0.7852 sec/batch
Epoch 9/20  Iteration 1484/3560 Training loss: 1.8715 0.7988 sec/batch
Epoch 9/20  Iteration 1485/3560 Training loss: 1.8712 0.7474 sec/batch
Epoch 9/20  Iteration 1486/3560 Training loss: 1.8719 0.6803 sec/batch
Epoch 9/20  Iteration 1487/3560 Training loss: 1.8722 0.8032 sec/batch
Epoch 9/20  Iteration 1488/3560 Training loss: 1.8725 0.7014 sec/batch
Epoch 9/20  Iteration 1489/3560 Training loss: 1.8723 0.8352 sec/batch
Epoch 9/20  Iteration 1490/3560 Training loss: 1.8725 0.7912 sec/batch
Epoch 9/20  Iteration 1491/3560 Training loss: 1.8724 0.7555 sec/batch
Epoch 9/20  Iteration 1492/3560 Training loss: 1.8718 0.7211 sec/batch
Epoch 9/20  Iteration 1493/3560 Training loss: 1.8717 0.7245 sec/batch
Epoch 9/20  Iteration 1494/3560 Training loss: 1.8713 0.7264 sec/batch
Epoch 9/20  Iteration 1495/3560 Training loss: 1.8717 0.7327 sec/batch
Epoch 9/20  Iteration 1496/3560 Training loss: 1.8717 0.7487 sec/batch
Epoch 9/20  Iteration 1497/3560 Training loss: 1.8720 0.6938 sec/batch
Epoch 9/20  Iteration 1498/3560 Training loss: 1.8717 0.8455 sec/batch
Epoch 9/20  Iteration 1499/3560 Training loss: 1.8717 0.8381 sec/batch
Epoch 9/20  Iteration 1500/3560 Training loss: 1.8719 0.7724 sec/batch
Epoch 9/20  Iteration 1501/3560 Training loss: 1.8717 0.7220 sec/batch
Epoch 9/20  Iteration 1502/3560 Training loss: 1.8717 0.7255 sec/batch
Epoch 9/20  Iteration 1503/3560 Training loss: 1.8712 0.7518 sec/batch
Epoch 9/20  Iteration 1504/3560 Training loss: 1.8711 0.7681 sec/batch
Epoch 9/20  Iteration 1505/3560 Training loss: 1.8705 0.6832 sec/batch
Epoch 9/20  Iteration 1506/3560 Training loss: 1.8705 0.6505 sec/batch
Epoch 9/20  Iteration 1507/3560 Training loss: 1.8701 0.6357 sec/batch
Epoch 9/20  Iteration 1508/3560 Training loss: 1.8699 0.6262 sec/batch
Epoch 9/20  Iteration 1509/3560 Training loss: 1.8694 0.6201 sec/batch
Epoch 9/20  Iteration 1510/3560 Training loss: 1.8689 0.6383 sec/batch
Epoch 9/20  Iteration 1511/3560 Training loss: 1.8687 0.6414 sec/batch
Epoch 9/20  Iteration 1512/3560 Training loss: 1.8684 0.6490 sec/batch
Epoch 9/20  Iteration 1513/3560 Training loss: 1.8679 0.6425 sec/batch
Epoch 9/20  Iteration 1514/3560 Training loss: 1.8680 0.6332 sec/batch
Epoch 9/20  Iteration 1515/3560 Training loss: 1.8676 0.6298 sec/batch
Epoch 9/20  Iteration 1516/3560 Training loss: 1.8675 0.6353 sec/batch
Epoch 9/20  Iteration 1517/3560 Training loss: 1.8670 0.6384 sec/batch
Epoch 9/20  Iteration 1518/3560 Training loss: 1.8665 0.6443 sec/batch
Epoch 9/20  Iteration 1519/3560 Training loss: 1.8661 0.6155 sec/batch
Epoch 9/20  Iteration 1520/3560 Training loss: 1.8660 0.6243 sec/batch
Epoch 9/20  Iteration 1521/3560 Training loss: 1.8658 0.6271 sec/batch
Epoch 9/20  Iteration 1522/3560 Training loss: 1.8654 0.6359 sec/batch
Epoch 9/20  Iteration 1523/3560 Training loss: 1.8650 0.6449 sec/batch
Epoch 9/20  Iteration 1524/3560 Training loss: 1.8645 0.6316 sec/batch
Epoch 9/20  Iteration 1525/3560 Training loss: 1.8644 0.6532 sec/batch
Epoch 9/20  Iteration 1526/3560 Training loss: 1.8644 0.6383 sec/batch
Epoch 9/20  Iteration 1527/3560 Training loss: 1.8642 0.6231 sec/batch
Epoch 9/20  Iteration 1528/3560 Training loss: 1.8638 0.6506 sec/batch
Epoch 9/20  Iteration 1529/3560 Training loss: 1.8635 0.6500 sec/batch
Epoch 9/20  Iteration 1530/3560 Training loss: 1.8634 0.6326 sec/batch
Epoch 9/20  Iteration 1531/3560 Training loss: 1.8632 0.6469 sec/batch
Epoch 9/20  Iteration 1532/3560 Training loss: 1.8631 0.6300 sec/batch
Epoch 9/20  Iteration 1533/3560 Training loss: 1.8632 0.6421 sec/batch
Epoch 9/20  Iteration 1534/3560 Training loss: 1.8632 0.6391 sec/batch
Epoch 9/20  Iteration 1535/3560 Training loss: 1.8631 0.6450 sec/batch
Epoch 9/20  Iteration 1536/3560 Training loss: 1.8628 0.6522 sec/batch
Epoch 9/20  Iteration 1537/3560 Training loss: 1.8627 0.6348 sec/batch
Epoch 9/20  Iteration 1538/3560 Training loss: 1.8625 0.6389 sec/batch
Epoch 9/20  Iteration 1539/3560 Training loss: 1.8622 0.6336 sec/batch
Epoch 9/20  Iteration 1540/3560 Training loss: 1.8618 0.6329 sec/batch
Epoch 9/20  Iteration 1541/3560 Training loss: 1.8618 0.6436 sec/batch
Epoch 9/20  Iteration 1542/3560 Training loss: 1.8617 0.6386 sec/batch
Epoch 9/20  Iteration 1543/3560 Training loss: 1.8616 0.6274 sec/batch
Epoch 9/20  Iteration 1544/3560 Training loss: 1.8615 0.6447 sec/batch
Epoch 9/20  Iteration 1545/3560 Training loss: 1.8614 0.6436 sec/batch
Epoch 9/20  Iteration 1546/3560 Training loss: 1.8611 0.6373 sec/batch
Epoch 9/20  Iteration 1547/3560 Training loss: 1.8608 0.6344 sec/batch
Epoch 9/20  Iteration 1548/3560 Training loss: 1.8609 0.6333 sec/batch
Epoch 9/20  Iteration 1549/3560 Training loss: 1.8608 0.6436 sec/batch
Epoch 9/20  Iteration 1550/3560 Training loss: 1.8604 0.6471 sec/batch
Epoch 9/20  Iteration 1551/3560 Training loss: 1.8605 0.6346 sec/batch
Epoch 9/20  Iteration 1552/3560 Training loss: 1.8605 0.6489 sec/batch
Epoch 9/20  Iteration 1553/3560 Training loss: 1.8604 0.6238 sec/batch
Epoch 9/20  Iteration 1554/3560 Training loss: 1.8603 0.6299 sec/batch
Epoch 9/20  Iteration 1555/3560 Training loss: 1.8599 0.6341 sec/batch
Epoch 9/20  Iteration 1556/3560 Training loss: 1.8596 0.6353 sec/batch
Epoch 9/20  Iteration 1557/3560 Training loss: 1.8596 0.6382 sec/batch
Epoch 9/20  Iteration 1558/3560 Training loss: 1.8596 0.6385 sec/batch
Epoch 9/20  Iteration 1559/3560 Training loss: 1.8595 0.6359 sec/batch
Epoch 9/20  Iteration 1560/3560 Training loss: 1.8596 0.6614 sec/batch
Epoch 9/20  Iteration 1561/3560 Training loss: 1.8596 0.6345 sec/batch
Epoch 9/20  Iteration 1562/3560 Training loss: 1.8596 0.6352 sec/batch
Epoch 9/20  Iteration 1563/3560 Training loss: 1.8597 0.6393 sec/batch
Epoch 9/20  Iteration 1564/3560 Training loss: 1.8595 0.6505 sec/batch
Epoch 9/20  Iteration 1565/3560 Training loss: 1.8596 0.6313 sec/batch
Epoch 9/20  Iteration 1566/3560 Training loss: 1.8595 0.6305 sec/batch
Epoch 9/20  Iteration 1567/3560 Training loss: 1.8594 0.6300 sec/batch
Epoch 9/20  Iteration 1568/3560 Training loss: 1.8593 0.6512 sec/batch
Epoch 9/20  Iteration 1569/3560 Training loss: 1.8592 0.6381 sec/batch
Epoch 9/20  Iteration 1570/3560 Training loss: 1.8593 0.6338 sec/batch
Epoch 9/20  Iteration 1571/3560 Training loss: 1.8593 0.6301 sec/batch
Epoch 9/20  Iteration 1572/3560 Training loss: 1.8595 0.6334 sec/batch
Epoch 9/20  Iteration 1573/3560 Training loss: 1.8595 0.6314 sec/batch
Epoch 9/20  Iteration 1574/3560 Training loss: 1.8593 0.6296 sec/batch
Epoch 9/20  Iteration 1575/3560 Training loss: 1.8591 0.6395 sec/batch
Epoch 9/20  Iteration 1576/3560 Training loss: 1.8591 0.6640 sec/batch
Epoch 9/20  Iteration 1577/3560 Training loss: 1.8592 0.6355 sec/batch
Epoch 9/20  Iteration 1578/3560 Training loss: 1.8592 0.6404 sec/batch
Epoch 9/20  Iteration 1579/3560 Training loss: 1.8591 0.6309 sec/batch
Epoch 9/20  Iteration 1580/3560 Training loss: 1.8590 0.6270 sec/batch
Epoch 9/20  Iteration 1581/3560 Training loss: 1.8590 0.6360 sec/batch
Epoch 9/20  Iteration 1582/3560 Training loss: 1.8589 0.6274 sec/batch
Epoch 9/20  Iteration 1583/3560 Training loss: 1.8586 0.6326 sec/batch
Epoch 9/20  Iteration 1584/3560 Training loss: 1.8586 0.6526 sec/batch
Epoch 9/20  Iteration 1585/3560 Training loss: 1.8587 0.6298 sec/batch
Epoch 9/20  Iteration 1586/3560 Training loss: 1.8586 0.6439 sec/batch
Epoch 9/20  Iteration 1587/3560 Training loss: 1.8587 0.6478 sec/batch
Epoch 9/20  Iteration 1588/3560 Training loss: 1.8586 0.6412 sec/batch
Epoch 9/20  Iteration 1589/3560 Training loss: 1.8585 0.6335 sec/batch
Epoch 9/20  Iteration 1590/3560 Training loss: 1.8585 0.6302 sec/batch
Epoch 9/20  Iteration 1591/3560 Training loss: 1.8584 0.6396 sec/batch
Epoch 9/20  Iteration 1592/3560 Training loss: 1.8587 0.6554 sec/batch
Epoch 9/20  Iteration 1593/3560 Training loss: 1.8586 0.6397 sec/batch
Epoch 9/20  Iteration 1594/3560 Training loss: 1.8585 0.6324 sec/batch
Epoch 9/20  Iteration 1595/3560 Training loss: 1.8583 0.6368 sec/batch
Epoch 9/20  Iteration 1596/3560 Training loss: 1.8582 0.6507 sec/batch
Epoch 9/20  Iteration 1597/3560 Training loss: 1.8582 0.6998 sec/batch
Epoch 9/20  Iteration 1598/3560 Training loss: 1.8582 0.6330 sec/batch
Epoch 9/20  Iteration 1599/3560 Training loss: 1.8583 0.6673 sec/batch
Epoch 9/20  Iteration 1600/3560 Training loss: 1.8582 0.6752 sec/batch
Epoch 9/20  Iteration 1601/3560 Training loss: 1.8580 0.6337 sec/batch
Epoch 9/20  Iteration 1602/3560 Training loss: 1.8580 0.6339 sec/batch
Epoch 10/20  Iteration 1603/3560 Training loss: 1.9025 0.6416 sec/batch
Epoch 10/20  Iteration 1604/3560 Training loss: 1.8681 0.6437 sec/batch
Epoch 10/20  Iteration 1605/3560 Training loss: 1.8571 0.6387 sec/batch
Epoch 10/20  Iteration 1606/3560 Training loss: 1.8530 0.6278 sec/batch
Epoch 10/20  Iteration 1607/3560 Training loss: 1.8492 0.6542 sec/batch
Epoch 10/20  Iteration 1608/3560 Training loss: 1.8419 0.6426 sec/batch
Epoch 10/20  Iteration 1609/3560 Training loss: 1.8421 0.6387 sec/batch
Epoch 10/20  Iteration 1610/3560 Training loss: 1.8417 0.6271 sec/batch
Epoch 10/20  Iteration 1611/3560 Training loss: 1.8438 0.6378 sec/batch
Epoch 10/20  Iteration 1612/3560 Training loss: 1.8437 0.6068 sec/batch
Epoch 10/20  Iteration 1613/3560 Training loss: 1.8408 0.6319 sec/batch
Epoch 10/20  Iteration 1614/3560 Training loss: 1.8394 0.6236 sec/batch
Epoch 10/20  Iteration 1615/3560 Training loss: 1.8393 0.6568 sec/batch
Epoch 10/20  Iteration 1616/3560 Training loss: 1.8421 0.6185 sec/batch
Epoch 10/20  Iteration 1617/3560 Training loss: 1.8416 0.6376 sec/batch
Epoch 10/20  Iteration 1618/3560 Training loss: 1.8401 0.6356 sec/batch
Epoch 10/20  Iteration 1619/3560 Training loss: 1.8395 0.6339 sec/batch
Epoch 10/20  Iteration 1620/3560 Training loss: 1.8423 0.6328 sec/batch
Epoch 10/20  Iteration 1621/3560 Training loss: 1.8415 0.6480 sec/batch
Epoch 10/20  Iteration 1622/3560 Training loss: 1.8416 0.6550 sec/batch
Epoch 10/20  Iteration 1623/3560 Training loss: 1.8412 0.6553 sec/batch
Epoch 10/20  Iteration 1624/3560 Training loss: 1.8421 0.6370 sec/batch
Epoch 10/20  Iteration 1625/3560 Training loss: 1.8418 0.6297 sec/batch
Epoch 10/20  Iteration 1626/3560 Training loss: 1.8415 0.6324 sec/batch
Epoch 10/20  Iteration 1627/3560 Training loss: 1.8417 0.6334 sec/batch
Epoch 10/20  Iteration 1628/3560 Training loss: 1.8408 0.6322 sec/batch
Epoch 10/20  Iteration 1629/3560 Training loss: 1.8401 0.6399 sec/batch
Epoch 10/20  Iteration 1630/3560 Training loss: 1.8402 0.6433 sec/batch
Epoch 10/20  Iteration 1631/3560 Training loss: 1.8408 0.6458 sec/batch
Epoch 10/20  Iteration 1632/3560 Training loss: 1.8414 0.6333 sec/batch
Epoch 10/20  Iteration 1633/3560 Training loss: 1.8410 0.6329 sec/batch
Epoch 10/20  Iteration 1634/3560 Training loss: 1.8402 0.6294 sec/batch
Epoch 10/20  Iteration 1635/3560 Training loss: 1.8410 0.6430 sec/batch
Epoch 10/20  Iteration 1636/3560 Training loss: 1.8416 0.6407 sec/batch
Epoch 10/20  Iteration 1637/3560 Training loss: 1.8411 0.6305 sec/batch
Epoch 10/20  Iteration 1638/3560 Training loss: 1.8411 0.6490 sec/batch
Epoch 10/20  Iteration 1639/3560 Training loss: 1.8409 0.6541 sec/batch
Epoch 10/20  Iteration 1640/3560 Training loss: 1.8398 0.6321 sec/batch
Epoch 10/20  Iteration 1641/3560 Training loss: 1.8384 0.6335 sec/batch
Epoch 10/20  Iteration 1642/3560 Training loss: 1.8379 0.6313 sec/batch
Epoch 10/20  Iteration 1643/3560 Training loss: 1.8375 0.6308 sec/batch
Epoch 10/20  Iteration 1644/3560 Training loss: 1.8375 0.6348 sec/batch
Epoch 10/20  Iteration 1645/3560 Training loss: 1.8368 0.6294 sec/batch
Epoch 10/20  Iteration 1646/3560 Training loss: 1.8359 0.6378 sec/batch
Epoch 10/20  Iteration 1647/3560 Training loss: 1.8360 0.6565 sec/batch
Epoch 10/20  Iteration 1648/3560 Training loss: 1.8348 0.6440 sec/batch
Epoch 10/20  Iteration 1649/3560 Training loss: 1.8346 0.6363 sec/batch
Epoch 10/20  Iteration 1650/3560 Training loss: 1.8342 0.6287 sec/batch
Epoch 10/20  Iteration 1651/3560 Training loss: 1.8342 0.6250 sec/batch
Epoch 10/20  Iteration 1652/3560 Training loss: 1.8350 0.6276 sec/batch
Epoch 10/20  Iteration 1653/3560 Training loss: 1.8345 0.6335 sec/batch
Epoch 10/20  Iteration 1654/3560 Training loss: 1.8351 0.6309 sec/batch
Epoch 10/20  Iteration 1655/3560 Training loss: 1.8349 0.6662 sec/batch
Epoch 10/20  Iteration 1656/3560 Training loss: 1.8348 0.6401 sec/batch
Epoch 10/20  Iteration 1657/3560 Training loss: 1.8343 0.6372 sec/batch
Epoch 10/20  Iteration 1658/3560 Training loss: 1.8345 0.6429 sec/batch
Epoch 10/20  Iteration 1659/3560 Training loss: 1.8346 0.6316 sec/batch
Epoch 10/20  Iteration 1660/3560 Training loss: 1.8343 0.6280 sec/batch
Epoch 10/20  Iteration 1661/3560 Training loss: 1.8340 0.6369 sec/batch
Epoch 10/20  Iteration 1662/3560 Training loss: 1.8344 0.6499 sec/batch
Epoch 10/20  Iteration 1663/3560 Training loss: 1.8343 0.6442 sec/batch
Epoch 10/20  Iteration 1664/3560 Training loss: 1.8352 0.6333 sec/batch
Epoch 10/20  Iteration 1665/3560 Training loss: 1.8356 0.6514 sec/batch
Epoch 10/20  Iteration 1666/3560 Training loss: 1.8358 0.6271 sec/batch
Epoch 10/20  Iteration 1667/3560 Training loss: 1.8357 0.6307 sec/batch
Epoch 10/20  Iteration 1668/3560 Training loss: 1.8359 0.6433 sec/batch
Epoch 10/20  Iteration 1669/3560 Training loss: 1.8361 0.6393 sec/batch
Epoch 10/20  Iteration 1670/3560 Training loss: 1.8356 0.6578 sec/batch
Epoch 10/20  Iteration 1671/3560 Training loss: 1.8354 0.6352 sec/batch
Epoch 10/20  Iteration 1672/3560 Training loss: 1.8352 0.6343 sec/batch
Epoch 10/20  Iteration 1673/3560 Training loss: 1.8353 0.6215 sec/batch
Epoch 10/20  Iteration 1674/3560 Training loss: 1.8356 0.6467 sec/batch
Epoch 10/20  Iteration 1675/3560 Training loss: 1.8359 0.6523 sec/batch
Epoch 10/20  Iteration 1676/3560 Training loss: 1.8356 0.6385 sec/batch
Epoch 10/20  Iteration 1677/3560 Training loss: 1.8356 0.6040 sec/batch
Epoch 10/20  Iteration 1678/3560 Training loss: 1.8358 0.6624 sec/batch
Epoch 10/20  Iteration 1679/3560 Training loss: 1.8357 0.6466 sec/batch
Epoch 10/20  Iteration 1680/3560 Training loss: 1.8357 0.6339 sec/batch
Epoch 10/20  Iteration 1681/3560 Training loss: 1.8352 0.6541 sec/batch
Epoch 10/20  Iteration 1682/3560 Training loss: 1.8351 0.6324 sec/batch
Epoch 10/20  Iteration 1683/3560 Training loss: 1.8347 0.6371 sec/batch
Epoch 10/20  Iteration 1684/3560 Training loss: 1.8346 0.6409 sec/batch
Epoch 10/20  Iteration 1685/3560 Training loss: 1.8341 0.6378 sec/batch
Epoch 10/20  Iteration 1686/3560 Training loss: 1.8339 0.6607 sec/batch
Epoch 10/20  Iteration 1687/3560 Training loss: 1.8334 0.6392 sec/batch
Epoch 10/20  Iteration 1688/3560 Training loss: 1.8331 0.7493 sec/batch
Epoch 10/20  Iteration 1689/3560 Training loss: 1.8329 0.8086 sec/batch
Epoch 10/20  Iteration 1690/3560 Training loss: 1.8325 0.7617 sec/batch
Epoch 10/20  Iteration 1691/3560 Training loss: 1.8321 0.6686 sec/batch
Epoch 10/20  Iteration 1692/3560 Training loss: 1.8322 0.6701 sec/batch
Epoch 10/20  Iteration 1693/3560 Training loss: 1.8320 0.6529 sec/batch
Epoch 10/20  Iteration 1694/3560 Training loss: 1.8319 0.6375 sec/batch
Epoch 10/20  Iteration 1695/3560 Training loss: 1.8315 0.6319 sec/batch
Epoch 10/20  Iteration 1696/3560 Training loss: 1.8311 0.6449 sec/batch
Epoch 10/20  Iteration 1697/3560 Training loss: 1.8307 0.6400 sec/batch
Epoch 10/20  Iteration 1698/3560 Training loss: 1.8307 0.6373 sec/batch
Epoch 10/20  Iteration 1699/3560 Training loss: 1.8306 0.6352 sec/batch
Epoch 10/20  Iteration 1700/3560 Training loss: 1.8302 0.6376 sec/batch
Epoch 10/20  Iteration 1701/3560 Training loss: 1.8298 0.6535 sec/batch
Epoch 10/20  Iteration 1702/3560 Training loss: 1.8295 0.6424 sec/batch
Epoch 10/20  Iteration 1703/3560 Training loss: 1.8295 0.6432 sec/batch
Epoch 10/20  Iteration 1704/3560 Training loss: 1.8293 0.6491 sec/batch
Epoch 10/20  Iteration 1705/3560 Training loss: 1.8290 0.6497 sec/batch
Epoch 10/20  Iteration 1706/3560 Training loss: 1.8289 0.6468 sec/batch
Epoch 10/20  Iteration 1707/3560 Training loss: 1.8286 0.6433 sec/batch
Epoch 10/20  Iteration 1708/3560 Training loss: 1.8284 0.6216 sec/batch
Epoch 10/20  Iteration 1709/3560 Training loss: 1.8282 0.6628 sec/batch
Epoch 10/20  Iteration 1710/3560 Training loss: 1.8281 0.6463 sec/batch
Epoch 10/20  Iteration 1711/3560 Training loss: 1.8283 0.6346 sec/batch
Epoch 10/20  Iteration 1712/3560 Training loss: 1.8281 0.6333 sec/batch
Epoch 10/20  Iteration 1713/3560 Training loss: 1.8281 0.6283 sec/batch
Epoch 10/20  Iteration 1714/3560 Training loss: 1.8279 0.6483 sec/batch
Epoch 10/20  Iteration 1715/3560 Training loss: 1.8277 0.6451 sec/batch
Epoch 10/20  Iteration 1716/3560 Training loss: 1.8275 0.6248 sec/batch
Epoch 10/20  Iteration 1717/3560 Training loss: 1.8273 0.6549 sec/batch
Epoch 10/20  Iteration 1718/3560 Training loss: 1.8269 0.6376 sec/batch
Epoch 10/20  Iteration 1719/3560 Training loss: 1.8269 0.6238 sec/batch
Epoch 10/20  Iteration 1720/3560 Training loss: 1.8268 0.6389 sec/batch
Epoch 10/20  Iteration 1721/3560 Training loss: 1.8267 0.6431 sec/batch
Epoch 10/20  Iteration 1722/3560 Training loss: 1.8266 0.6318 sec/batch
Epoch 10/20  Iteration 1723/3560 Training loss: 1.8267 0.6311 sec/batch
Epoch 10/20  Iteration 1724/3560 Training loss: 1.8264 0.6287 sec/batch
Epoch 10/20  Iteration 1725/3560 Training loss: 1.8262 0.6414 sec/batch
Epoch 10/20  Iteration 1726/3560 Training loss: 1.8263 0.6414 sec/batch
Epoch 10/20  Iteration 1727/3560 Training loss: 1.8262 0.6332 sec/batch
Epoch 10/20  Iteration 1728/3560 Training loss: 1.8258 0.6312 sec/batch
Epoch 10/20  Iteration 1729/3560 Training loss: 1.8259 0.6228 sec/batch
Epoch 10/20  Iteration 1730/3560 Training loss: 1.8260 0.6439 sec/batch
Epoch 10/20  Iteration 1731/3560 Training loss: 1.8258 0.6431 sec/batch
Epoch 10/20  Iteration 1732/3560 Training loss: 1.8257 0.6381 sec/batch
Epoch 10/20  Iteration 1733/3560 Training loss: 1.8255 0.6552 sec/batch
Epoch 10/20  Iteration 1734/3560 Training loss: 1.8252 0.6357 sec/batch
Epoch 10/20  Iteration 1735/3560 Training loss: 1.8252 0.6424 sec/batch
Epoch 10/20  Iteration 1736/3560 Training loss: 1.8253 0.6409 sec/batch
Epoch 10/20  Iteration 1737/3560 Training loss: 1.8253 0.6465 sec/batch
Epoch 10/20  Iteration 1738/3560 Training loss: 1.8254 0.6344 sec/batch
Epoch 10/20  Iteration 1739/3560 Training loss: 1.8254 0.6354 sec/batch
Epoch 10/20  Iteration 1740/3560 Training loss: 1.8254 0.6494 sec/batch
Epoch 10/20  Iteration 1741/3560 Training loss: 1.8256 0.6162 sec/batch
Epoch 10/20  Iteration 1742/3560 Training loss: 1.8254 0.6328 sec/batch
Epoch 10/20  Iteration 1743/3560 Training loss: 1.8257 0.6334 sec/batch
Epoch 10/20  Iteration 1744/3560 Training loss: 1.8256 0.6348 sec/batch
Epoch 10/20  Iteration 1745/3560 Training loss: 1.8255 0.6355 sec/batch
Epoch 10/20  Iteration 1746/3560 Training loss: 1.8255 0.6446 sec/batch
Epoch 10/20  Iteration 1747/3560 Training loss: 1.8254 0.6446 sec/batch
Epoch 10/20  Iteration 1748/3560 Training loss: 1.8253 0.6506 sec/batch
Epoch 10/20  Iteration 1749/3560 Training loss: 1.8253 0.6337 sec/batch
Epoch 10/20  Iteration 1750/3560 Training loss: 1.8255 0.6174 sec/batch
Epoch 10/20  Iteration 1751/3560 Training loss: 1.8255 0.6338 sec/batch
Epoch 10/20  Iteration 1752/3560 Training loss: 1.8253 0.6304 sec/batch
Epoch 10/20  Iteration 1753/3560 Training loss: 1.8251 0.6442 sec/batch
Epoch 10/20  Iteration 1754/3560 Training loss: 1.8251 0.6259 sec/batch
Epoch 10/20  Iteration 1755/3560 Training loss: 1.8251 0.6399 sec/batch
Epoch 10/20  Iteration 1756/3560 Training loss: 1.8251 0.6495 sec/batch
Epoch 10/20  Iteration 1757/3560 Training loss: 1.8251 0.6291 sec/batch
Epoch 10/20  Iteration 1758/3560 Training loss: 1.8250 0.6361 sec/batch
Epoch 10/20  Iteration 1759/3560 Training loss: 1.8250 0.6344 sec/batch
Epoch 10/20  Iteration 1760/3560 Training loss: 1.8249 0.6200 sec/batch
Epoch 10/20  Iteration 1761/3560 Training loss: 1.8246 0.6558 sec/batch
Epoch 10/20  Iteration 1762/3560 Training loss: 1.8247 0.6367 sec/batch
Epoch 10/20  Iteration 1763/3560 Training loss: 1.8249 0.6463 sec/batch
Epoch 10/20  Iteration 1764/3560 Training loss: 1.8248 0.6300 sec/batch
Epoch 10/20  Iteration 1765/3560 Training loss: 1.8248 0.6434 sec/batch
Epoch 10/20  Iteration 1766/3560 Training loss: 1.8248 0.6424 sec/batch
Epoch 10/20  Iteration 1767/3560 Training loss: 1.8247 0.6345 sec/batch
Epoch 10/20  Iteration 1768/3560 Training loss: 1.8245 0.6303 sec/batch
Epoch 10/20  Iteration 1769/3560 Training loss: 1.8245 0.6378 sec/batch
Epoch 10/20  Iteration 1770/3560 Training loss: 1.8248 0.6493 sec/batch
Epoch 10/20  Iteration 1771/3560 Training loss: 1.8247 0.6384 sec/batch
Epoch 10/20  Iteration 1772/3560 Training loss: 1.8247 0.6509 sec/batch
Epoch 10/20  Iteration 1773/3560 Training loss: 1.8245 0.6529 sec/batch
Epoch 10/20  Iteration 1774/3560 Training loss: 1.8244 0.6298 sec/batch
Epoch 10/20  Iteration 1775/3560 Training loss: 1.8244 0.6472 sec/batch
Epoch 10/20  Iteration 1776/3560 Training loss: 1.8244 0.6404 sec/batch
Epoch 10/20  Iteration 1777/3560 Training loss: 1.8244 0.6393 sec/batch
Epoch 10/20  Iteration 1778/3560 Training loss: 1.8243 0.6292 sec/batch
Epoch 10/20  Iteration 1779/3560 Training loss: 1.8242 0.6473 sec/batch
Epoch 10/20  Iteration 1780/3560 Training loss: 1.8242 0.6653 sec/batch
Epoch 11/20  Iteration 1781/3560 Training loss: 1.8679 0.6267 sec/batch
Epoch 11/20  Iteration 1782/3560 Training loss: 1.8340 0.6998 sec/batch
Epoch 11/20  Iteration 1783/3560 Training loss: 1.8251 0.6633 sec/batch
Epoch 11/20  Iteration 1784/3560 Training loss: 1.8209 0.6610 sec/batch
Epoch 11/20  Iteration 1785/3560 Training loss: 1.8188 0.6679 sec/batch
Epoch 11/20  Iteration 1786/3560 Training loss: 1.8097 0.6351 sec/batch
Epoch 11/20  Iteration 1787/3560 Training loss: 1.8105 0.6377 sec/batch
Epoch 11/20  Iteration 1788/3560 Training loss: 1.8101 0.6589 sec/batch
Epoch 11/20  Iteration 1789/3560 Training loss: 1.8133 0.6291 sec/batch
Epoch 11/20  Iteration 1790/3560 Training loss: 1.8133 0.6360 sec/batch
Epoch 11/20  Iteration 1791/3560 Training loss: 1.8105 0.6457 sec/batch
Epoch 11/20  Iteration 1792/3560 Training loss: 1.8096 0.6367 sec/batch
Epoch 11/20  Iteration 1793/3560 Training loss: 1.8095 0.6444 sec/batch
Epoch 11/20  Iteration 1794/3560 Training loss: 1.8117 0.6329 sec/batch
Epoch 11/20  Iteration 1795/3560 Training loss: 1.8110 0.6509 sec/batch
Epoch 11/20  Iteration 1796/3560 Training loss: 1.8093 0.6313 sec/batch
Epoch 11/20  Iteration 1797/3560 Training loss: 1.8092 0.6394 sec/batch
Epoch 11/20  Iteration 1798/3560 Training loss: 1.8110 0.6308 sec/batch
Epoch 11/20  Iteration 1799/3560 Training loss: 1.8111 0.6236 sec/batch
Epoch 11/20  Iteration 1800/3560 Training loss: 1.8112 0.6455 sec/batch
Epoch 11/20  Iteration 1801/3560 Training loss: 1.8105 0.6361 sec/batch
Epoch 11/20  Iteration 1802/3560 Training loss: 1.8115 0.6262 sec/batch
Epoch 11/20  Iteration 1803/3560 Training loss: 1.8107 0.6451 sec/batch
Epoch 11/20  Iteration 1804/3560 Training loss: 1.8100 0.6273 sec/batch
Epoch 11/20  Iteration 1805/3560 Training loss: 1.8095 0.6281 sec/batch
Epoch 11/20  Iteration 1806/3560 Training loss: 1.8086 0.6318 sec/batch
Epoch 11/20  Iteration 1807/3560 Training loss: 1.8079 0.6518 sec/batch
Epoch 11/20  Iteration 1808/3560 Training loss: 1.8082 0.6419 sec/batch
Epoch 11/20  Iteration 1809/3560 Training loss: 1.8094 0.6384 sec/batch
Epoch 11/20  Iteration 1810/3560 Training loss: 1.8100 0.6429 sec/batch
Epoch 11/20  Iteration 1811/3560 Training loss: 1.8095 0.6550 sec/batch
Epoch 11/20  Iteration 1812/3560 Training loss: 1.8087 0.6336 sec/batch
Epoch 11/20  Iteration 1813/3560 Training loss: 1.8089 0.6432 sec/batch
Epoch 11/20  Iteration 1814/3560 Training loss: 1.8093 0.6414 sec/batch
Epoch 11/20  Iteration 1815/3560 Training loss: 1.8091 0.8516 sec/batch
Epoch 11/20  Iteration 1816/3560 Training loss: 1.8089 0.8219 sec/batch
Epoch 11/20  Iteration 1817/3560 Training loss: 1.8085 0.6773 sec/batch
Epoch 11/20  Iteration 1818/3560 Training loss: 1.8074 0.6579 sec/batch
Epoch 11/20  Iteration 1819/3560 Training loss: 1.8063 0.6500 sec/batch
Epoch 11/20  Iteration 1820/3560 Training loss: 1.8055 0.6437 sec/batch
Epoch 11/20  Iteration 1821/3560 Training loss: 1.8052 0.6269 sec/batch
Epoch 11/20  Iteration 1822/3560 Training loss: 1.8052 0.6348 sec/batch
Epoch 11/20  Iteration 1823/3560 Training loss: 1.8048 0.6497 sec/batch
Epoch 11/20  Iteration 1824/3560 Training loss: 1.8039 0.6327 sec/batch
Epoch 11/20  Iteration 1825/3560 Training loss: 1.8039 0.6286 sec/batch
Epoch 11/20  Iteration 1826/3560 Training loss: 1.8028 0.6507 sec/batch
Epoch 11/20  Iteration 1827/3560 Training loss: 1.8030 0.6311 sec/batch
Epoch 11/20  Iteration 1828/3560 Training loss: 1.8025 0.6296 sec/batch
Epoch 11/20  Iteration 1829/3560 Training loss: 1.8025 0.6271 sec/batch
Epoch 11/20  Iteration 1830/3560 Training loss: 1.8029 0.8482 sec/batch
Epoch 11/20  Iteration 1831/3560 Training loss: 1.8024 0.6901 sec/batch
Epoch 11/20  Iteration 1832/3560 Training loss: 1.8032 0.6338 sec/batch
Epoch 11/20  Iteration 1833/3560 Training loss: 1.8030 0.6361 sec/batch
Epoch 11/20  Iteration 1834/3560 Training loss: 1.8030 0.6559 sec/batch
Epoch 11/20  Iteration 1835/3560 Training loss: 1.8026 0.6349 sec/batch
Epoch 11/20  Iteration 1836/3560 Training loss: 1.8026 0.6297 sec/batch
Epoch 11/20  Iteration 1837/3560 Training loss: 1.8028 0.6407 sec/batch
Epoch 11/20  Iteration 1838/3560 Training loss: 1.8028 0.6475 sec/batch
Epoch 11/20  Iteration 1839/3560 Training loss: 1.8023 0.6563 sec/batch
Epoch 11/20  Iteration 1840/3560 Training loss: 1.8027 0.6304 sec/batch
Epoch 11/20  Iteration 1841/3560 Training loss: 1.8026 0.6358 sec/batch
Epoch 11/20  Iteration 1842/3560 Training loss: 1.8034 0.6534 sec/batch
Epoch 11/20  Iteration 1843/3560 Training loss: 1.8039 0.6428 sec/batch
Epoch 11/20  Iteration 1844/3560 Training loss: 1.8042 0.6372 sec/batch
Epoch 11/20  Iteration 1845/3560 Training loss: 1.8041 0.6352 sec/batch
Epoch 11/20  Iteration 1846/3560 Training loss: 1.8044 0.6379 sec/batch
Epoch 11/20  Iteration 1847/3560 Training loss: 1.8044 0.6244 sec/batch
Epoch 11/20  Iteration 1848/3560 Training loss: 1.8039 0.6273 sec/batch
Epoch 11/20  Iteration 1849/3560 Training loss: 1.8037 0.6257 sec/batch
Epoch 11/20  Iteration 1850/3560 Training loss: 1.8036 0.6582 sec/batch
Epoch 11/20  Iteration 1851/3560 Training loss: 1.8039 0.6450 sec/batch
Epoch 11/20  Iteration 1852/3560 Training loss: 1.8040 0.6393 sec/batch
Epoch 11/20  Iteration 1853/3560 Training loss: 1.8044 0.6559 sec/batch
Epoch 11/20  Iteration 1854/3560 Training loss: 1.8041 0.6465 sec/batch
Epoch 11/20  Iteration 1855/3560 Training loss: 1.8040 0.6453 sec/batch
Epoch 11/20  Iteration 1856/3560 Training loss: 1.8041 0.6433 sec/batch
Epoch 11/20  Iteration 1857/3560 Training loss: 1.8041 0.6564 sec/batch
Epoch 11/20  Iteration 1858/3560 Training loss: 1.8041 0.6207 sec/batch
Epoch 11/20  Iteration 1859/3560 Training loss: 1.8038 0.6362 sec/batch
Epoch 11/20  Iteration 1860/3560 Training loss: 1.8035 0.6206 sec/batch
Epoch 11/20  Iteration 1861/3560 Training loss: 1.8029 0.6446 sec/batch
Epoch 11/20  Iteration 1862/3560 Training loss: 1.8030 0.6342 sec/batch
Epoch 11/20  Iteration 1863/3560 Training loss: 1.8025 0.6339 sec/batch
Epoch 11/20  Iteration 1864/3560 Training loss: 1.8024 0.6355 sec/batch
Epoch 11/20  Iteration 1865/3560 Training loss: 1.8019 0.6870 sec/batch
Epoch 11/20  Iteration 1866/3560 Training loss: 1.8017 0.6387 sec/batch
Epoch 11/20  Iteration 1867/3560 Training loss: 1.8015 0.6214 sec/batch
Epoch 11/20  Iteration 1868/3560 Training loss: 1.8012 0.6413 sec/batch
Epoch 11/20  Iteration 1869/3560 Training loss: 1.8008 0.6396 sec/batch
Epoch 11/20  Iteration 1870/3560 Training loss: 1.8010 0.6427 sec/batch
Epoch 11/20  Iteration 1871/3560 Training loss: 1.8006 0.6352 sec/batch
Epoch 11/20  Iteration 1872/3560 Training loss: 1.8006 0.6367 sec/batch
Epoch 11/20  Iteration 1873/3560 Training loss: 1.8002 0.6532 sec/batch
Epoch 11/20  Iteration 1874/3560 Training loss: 1.7998 0.6526 sec/batch
Epoch 11/20  Iteration 1875/3560 Training loss: 1.7993 0.6412 sec/batch
Epoch 11/20  Iteration 1876/3560 Training loss: 1.7994 0.6371 sec/batch
Epoch 11/20  Iteration 1877/3560 Training loss: 1.7994 0.6360 sec/batch
Epoch 11/20  Iteration 1878/3560 Training loss: 1.7990 0.6379 sec/batch
Epoch 11/20  Iteration 1879/3560 Training loss: 1.7987 0.6335 sec/batch
Epoch 11/20  Iteration 1880/3560 Training loss: 1.7983 0.6338 sec/batch
Epoch 11/20  Iteration 1881/3560 Training loss: 1.7982 0.6552 sec/batch
Epoch 11/20  Iteration 1882/3560 Training loss: 1.7981 0.6323 sec/batch
Epoch 11/20  Iteration 1883/3560 Training loss: 1.7978 0.6420 sec/batch
Epoch 11/20  Iteration 1884/3560 Training loss: 1.7975 0.6409 sec/batch
Epoch 11/20  Iteration 1885/3560 Training loss: 1.7973 0.6320 sec/batch
Epoch 11/20  Iteration 1886/3560 Training loss: 1.7972 0.6347 sec/batch
Epoch 11/20  Iteration 1887/3560 Training loss: 1.7971 0.6303 sec/batch
Epoch 11/20  Iteration 1888/3560 Training loss: 1.7971 0.6317 sec/batch
Epoch 11/20  Iteration 1889/3560 Training loss: 1.7972 0.6552 sec/batch
Epoch 11/20  Iteration 1890/3560 Training loss: 1.7971 0.6399 sec/batch
Epoch 11/20  Iteration 1891/3560 Training loss: 1.7970 0.6235 sec/batch
Epoch 11/20  Iteration 1892/3560 Training loss: 1.7967 0.6440 sec/batch
Epoch 11/20  Iteration 1893/3560 Training loss: 1.7967 0.6208 sec/batch
Epoch 11/20  Iteration 1894/3560 Training loss: 1.7966 0.6325 sec/batch
Epoch 11/20  Iteration 1895/3560 Training loss: 1.7963 0.6413 sec/batch
Epoch 11/20  Iteration 1896/3560 Training loss: 1.7961 0.6435 sec/batch
Epoch 11/20  Iteration 1897/3560 Training loss: 1.7960 0.6545 sec/batch
Epoch 11/20  Iteration 1898/3560 Training loss: 1.7959 0.6354 sec/batch
Epoch 11/20  Iteration 1899/3560 Training loss: 1.7958 0.6503 sec/batch
Epoch 11/20  Iteration 1900/3560 Training loss: 1.7957 0.6473 sec/batch
Epoch 11/20  Iteration 1901/3560 Training loss: 1.7957 0.6163 sec/batch
Epoch 11/20  Iteration 1902/3560 Training loss: 1.7955 0.6393 sec/batch
Epoch 11/20  Iteration 1903/3560 Training loss: 1.7953 0.6366 sec/batch
Epoch 11/20  Iteration 1904/3560 Training loss: 1.7954 0.6342 sec/batch
Epoch 11/20  Iteration 1905/3560 Training loss: 1.7954 0.6529 sec/batch
Epoch 11/20  Iteration 1906/3560 Training loss: 1.7950 0.6405 sec/batch
Epoch 11/20  Iteration 1907/3560 Training loss: 1.7951 0.6329 sec/batch
Epoch 11/20  Iteration 1908/3560 Training loss: 1.7952 0.6370 sec/batch
Epoch 11/20  Iteration 1909/3560 Training loss: 1.7950 0.6446 sec/batch
Epoch 11/20  Iteration 1910/3560 Training loss: 1.7950 0.6308 sec/batch
Epoch 11/20  Iteration 1911/3560 Training loss: 1.7948 0.6359 sec/batch
Epoch 11/20  Iteration 1912/3560 Training loss: 1.7946 0.6313 sec/batch
Epoch 11/20  Iteration 1913/3560 Training loss: 1.7946 0.6580 sec/batch
Epoch 11/20  Iteration 1914/3560 Training loss: 1.7946 0.6379 sec/batch
Epoch 11/20  Iteration 1915/3560 Training loss: 1.7946 0.6337 sec/batch
Epoch 11/20  Iteration 1916/3560 Training loss: 1.7946 0.6259 sec/batch
Epoch 11/20  Iteration 1917/3560 Training loss: 1.7947 0.6446 sec/batch
Epoch 11/20  Iteration 1918/3560 Training loss: 1.7948 0.6322 sec/batch
Epoch 11/20  Iteration 1919/3560 Training loss: 1.7950 0.6485 sec/batch
Epoch 11/20  Iteration 1920/3560 Training loss: 1.7949 0.6558 sec/batch
Epoch 11/20  Iteration 1921/3560 Training loss: 1.7951 0.6336 sec/batch
Epoch 11/20  Iteration 1922/3560 Training loss: 1.7951 0.6310 sec/batch
Epoch 11/20  Iteration 1923/3560 Training loss: 1.7950 0.6407 sec/batch
Epoch 11/20  Iteration 1924/3560 Training loss: 1.7950 0.6291 sec/batch
Epoch 11/20  Iteration 1925/3560 Training loss: 1.7949 0.6302 sec/batch
Epoch 11/20  Iteration 1926/3560 Training loss: 1.7950 0.6358 sec/batch
Epoch 11/20  Iteration 1927/3560 Training loss: 1.7950 0.6283 sec/batch
Epoch 11/20  Iteration 1928/3560 Training loss: 1.7952 0.6620 sec/batch
Epoch 11/20  Iteration 1929/3560 Training loss: 1.7952 0.6283 sec/batch
Epoch 11/20  Iteration 1930/3560 Training loss: 1.7951 0.6336 sec/batch
Epoch 11/20  Iteration 1931/3560 Training loss: 1.7949 0.6379 sec/batch
Epoch 11/20  Iteration 1932/3560 Training loss: 1.7950 0.6331 sec/batch
Epoch 11/20  Iteration 1933/3560 Training loss: 1.7950 0.6358 sec/batch
Epoch 11/20  Iteration 1934/3560 Training loss: 1.7950 0.6289 sec/batch
Epoch 11/20  Iteration 1935/3560 Training loss: 1.7950 0.6383 sec/batch
Epoch 11/20  Iteration 1936/3560 Training loss: 1.7950 0.6531 sec/batch
Epoch 11/20  Iteration 1937/3560 Training loss: 1.7950 0.6463 sec/batch
Epoch 11/20  Iteration 1938/3560 Training loss: 1.7950 0.6368 sec/batch
Epoch 11/20  Iteration 1939/3560 Training loss: 1.7948 0.6395 sec/batch
Epoch 11/20  Iteration 1940/3560 Training loss: 1.7949 0.6253 sec/batch
Epoch 11/20  Iteration 1941/3560 Training loss: 1.7951 0.6288 sec/batch
Epoch 11/20  Iteration 1942/3560 Training loss: 1.7950 0.6371 sec/batch
Epoch 11/20  Iteration 1943/3560 Training loss: 1.7950 0.6276 sec/batch
Epoch 11/20  Iteration 1944/3560 Training loss: 1.7950 0.6543 sec/batch
Epoch 11/20  Iteration 1945/3560 Training loss: 1.7949 0.6224 sec/batch
Epoch 11/20  Iteration 1946/3560 Training loss: 1.7949 0.6565 sec/batch
Epoch 11/20  Iteration 1947/3560 Training loss: 1.7949 0.6486 sec/batch
Epoch 11/20  Iteration 1948/3560 Training loss: 1.7953 0.6233 sec/batch
Epoch 11/20  Iteration 1949/3560 Training loss: 1.7952 0.6408 sec/batch
Epoch 11/20  Iteration 1950/3560 Training loss: 1.7951 0.6396 sec/batch
Epoch 11/20  Iteration 1951/3560 Training loss: 1.7950 0.6370 sec/batch
Epoch 11/20  Iteration 1952/3560 Training loss: 1.7948 0.6525 sec/batch
Epoch 11/20  Iteration 1953/3560 Training loss: 1.7949 0.6329 sec/batch
Epoch 11/20  Iteration 1954/3560 Training loss: 1.7950 0.6335 sec/batch
Epoch 11/20  Iteration 1955/3560 Training loss: 1.7951 0.6144 sec/batch
Epoch 11/20  Iteration 1956/3560 Training loss: 1.7949 0.6565 sec/batch
Epoch 11/20  Iteration 1957/3560 Training loss: 1.7948 0.6135 sec/batch
Epoch 11/20  Iteration 1958/3560 Training loss: 1.7948 0.6425 sec/batch
Epoch 12/20  Iteration 1959/3560 Training loss: 1.8514 0.6549 sec/batch
Epoch 12/20  Iteration 1960/3560 Training loss: 1.8130 0.6490 sec/batch
Epoch 12/20  Iteration 1961/3560 Training loss: 1.8030 0.6279 sec/batch
Epoch 12/20  Iteration 1962/3560 Training loss: 1.7955 0.6298 sec/batch
Epoch 12/20  Iteration 1963/3560 Training loss: 1.7922 0.6298 sec/batch
Epoch 12/20  Iteration 1964/3560 Training loss: 1.7844 0.6252 sec/batch
Epoch 12/20  Iteration 1965/3560 Training loss: 1.7836 0.6286 sec/batch
Epoch 12/20  Iteration 1966/3560 Training loss: 1.7830 0.6539 sec/batch
Epoch 12/20  Iteration 1967/3560 Training loss: 1.7856 0.7135 sec/batch
Epoch 12/20  Iteration 1968/3560 Training loss: 1.7862 0.6859 sec/batch
Epoch 12/20  Iteration 1969/3560 Training loss: 1.7834 0.6361 sec/batch
Epoch 12/20  Iteration 1970/3560 Training loss: 1.7815 0.6543 sec/batch
Epoch 12/20  Iteration 1971/3560 Training loss: 1.7816 0.6772 sec/batch
Epoch 12/20  Iteration 1972/3560 Training loss: 1.7844 0.6390 sec/batch
Epoch 12/20  Iteration 1973/3560 Training loss: 1.7843 0.6440 sec/batch
Epoch 12/20  Iteration 1974/3560 Training loss: 1.7827 0.6250 sec/batch
Epoch 12/20  Iteration 1975/3560 Training loss: 1.7826 0.6422 sec/batch
Epoch 12/20  Iteration 1976/3560 Training loss: 1.7845 0.6614 sec/batch
Epoch 12/20  Iteration 1977/3560 Training loss: 1.7844 0.6348 sec/batch
Epoch 12/20  Iteration 1978/3560 Training loss: 1.7849 0.6279 sec/batch
Epoch 12/20  Iteration 1979/3560 Training loss: 1.7843 0.6347 sec/batch
Epoch 12/20  Iteration 1980/3560 Training loss: 1.7850 0.6291 sec/batch
Epoch 12/20  Iteration 1981/3560 Training loss: 1.7845 0.6481 sec/batch
Epoch 12/20  Iteration 1982/3560 Training loss: 1.7839 0.6329 sec/batch
Epoch 12/20  Iteration 1983/3560 Training loss: 1.7837 0.6531 sec/batch
Epoch 12/20  Iteration 1984/3560 Training loss: 1.7830 0.6370 sec/batch
Epoch 12/20  Iteration 1985/3560 Training loss: 1.7822 0.6345 sec/batch
Epoch 12/20  Iteration 1986/3560 Training loss: 1.7825 0.6370 sec/batch
Epoch 12/20  Iteration 1987/3560 Training loss: 1.7832 0.6332 sec/batch
Epoch 12/20  Iteration 1988/3560 Training loss: 1.7836 0.6323 sec/batch
Epoch 12/20  Iteration 1989/3560 Training loss: 1.7835 0.6307 sec/batch
Epoch 12/20  Iteration 1990/3560 Training loss: 1.7831 0.6473 sec/batch
Epoch 12/20  Iteration 1991/3560 Training loss: 1.7837 0.6550 sec/batch
Epoch 12/20  Iteration 1992/3560 Training loss: 1.7845 0.6473 sec/batch
Epoch 12/20  Iteration 1993/3560 Training loss: 1.7842 0.6978 sec/batch
Epoch 12/20  Iteration 1994/3560 Training loss: 1.7842 0.6325 sec/batch
Epoch 12/20  Iteration 1995/3560 Training loss: 1.7837 0.6364 sec/batch
Epoch 12/20  Iteration 1996/3560 Training loss: 1.7825 0.6262 sec/batch
Epoch 12/20  Iteration 1997/3560 Training loss: 1.7813 0.6262 sec/batch
Epoch 12/20  Iteration 1998/3560 Training loss: 1.7807 0.6748 sec/batch
Epoch 12/20  Iteration 1999/3560 Training loss: 1.7801 0.6469 sec/batch
Epoch 12/20  Iteration 2000/3560 Training loss: 1.7802 0.6443 sec/batch
Epoch 12/20  Iteration 2001/3560 Training loss: 1.7798 0.6312 sec/batch
Epoch 12/20  Iteration 2002/3560 Training loss: 1.7790 0.6351 sec/batch
Epoch 12/20  Iteration 2003/3560 Training loss: 1.7790 0.6338 sec/batch
Epoch 12/20  Iteration 2004/3560 Training loss: 1.7780 0.6351 sec/batch
Epoch 12/20  Iteration 2005/3560 Training loss: 1.7778 0.6352 sec/batch
Epoch 12/20  Iteration 2006/3560 Training loss: 1.7773 0.6373 sec/batch
Epoch 12/20  Iteration 2007/3560 Training loss: 1.7772 0.6534 sec/batch
Epoch 12/20  Iteration 2008/3560 Training loss: 1.7780 0.6353 sec/batch
Epoch 12/20  Iteration 2009/3560 Training loss: 1.7776 0.6502 sec/batch
Epoch 12/20  Iteration 2010/3560 Training loss: 1.7784 0.6353 sec/batch
Epoch 12/20  Iteration 2011/3560 Training loss: 1.7781 0.6318 sec/batch
Epoch 12/20  Iteration 2012/3560 Training loss: 1.7780 0.6372 sec/batch
Epoch 12/20  Iteration 2013/3560 Training loss: 1.7779 0.6323 sec/batch
Epoch 12/20  Iteration 2014/3560 Training loss: 1.7778 0.6380 sec/batch
Epoch 12/20  Iteration 2015/3560 Training loss: 1.7782 0.6565 sec/batch
Epoch 12/20  Iteration 2016/3560 Training loss: 1.7779 0.6402 sec/batch
Epoch 12/20  Iteration 2017/3560 Training loss: 1.7775 0.6513 sec/batch
Epoch 12/20  Iteration 2018/3560 Training loss: 1.7780 0.6339 sec/batch
Epoch 12/20  Iteration 2019/3560 Training loss: 1.7777 0.6274 sec/batch
Epoch 12/20  Iteration 2020/3560 Training loss: 1.7784 0.6213 sec/batch
Epoch 12/20  Iteration 2021/3560 Training loss: 1.7788 0.6368 sec/batch
Epoch 12/20  Iteration 2022/3560 Training loss: 1.7789 0.6433 sec/batch
Epoch 12/20  Iteration 2023/3560 Training loss: 1.7789 0.6562 sec/batch
Epoch 12/20  Iteration 2024/3560 Training loss: 1.7790 0.6347 sec/batch
Epoch 12/20  Iteration 2025/3560 Training loss: 1.7791 0.6305 sec/batch
Epoch 12/20  Iteration 2026/3560 Training loss: 1.7787 0.6446 sec/batch
Epoch 12/20  Iteration 2027/3560 Training loss: 1.7787 0.6357 sec/batch
Epoch 12/20  Iteration 2028/3560 Training loss: 1.7784 0.6406 sec/batch
Epoch 12/20  Iteration 2029/3560 Training loss: 1.7788 0.6334 sec/batch
Epoch 12/20  Iteration 2030/3560 Training loss: 1.7789 0.6283 sec/batch
Epoch 12/20  Iteration 2031/3560 Training loss: 1.7792 0.6493 sec/batch
Epoch 12/20  Iteration 2032/3560 Training loss: 1.7789 0.6366 sec/batch
Epoch 12/20  Iteration 2033/3560 Training loss: 1.7787 0.6432 sec/batch
Epoch 12/20  Iteration 2034/3560 Training loss: 1.7789 0.6120 sec/batch
Epoch 12/20  Iteration 2035/3560 Training loss: 1.7789 0.6420 sec/batch
Epoch 12/20  Iteration 2036/3560 Training loss: 1.7789 0.6130 sec/batch
Epoch 12/20  Iteration 2037/3560 Training loss: 1.7785 0.6302 sec/batch
Epoch 12/20  Iteration 2038/3560 Training loss: 1.7784 0.6366 sec/batch
Epoch 12/20  Iteration 2039/3560 Training loss: 1.7780 0.6719 sec/batch
Epoch 12/20  Iteration 2040/3560 Training loss: 1.7781 0.6420 sec/batch
Epoch 12/20  Iteration 2041/3560 Training loss: 1.7776 0.6420 sec/batch
Epoch 12/20  Iteration 2042/3560 Training loss: 1.7776 0.6516 sec/batch
Epoch 12/20  Iteration 2043/3560 Training loss: 1.7770 0.6375 sec/batch
Epoch 12/20  Iteration 2044/3560 Training loss: 1.7768 0.6379 sec/batch
Epoch 12/20  Iteration 2045/3560 Training loss: 1.7766 0.6396 sec/batch
Epoch 12/20  Iteration 2046/3560 Training loss: 1.7762 0.6472 sec/batch
Epoch 12/20  Iteration 2047/3560 Training loss: 1.7758 0.6255 sec/batch
Epoch 12/20  Iteration 2048/3560 Training loss: 1.7758 0.6347 sec/batch
Epoch 12/20  Iteration 2049/3560 Training loss: 1.7755 0.6326 sec/batch
Epoch 12/20  Iteration 2050/3560 Training loss: 1.7753 0.6235 sec/batch
Epoch 12/20  Iteration 2051/3560 Training loss: 1.7750 0.6352 sec/batch
Epoch 12/20  Iteration 2052/3560 Training loss: 1.7746 0.6405 sec/batch
Epoch 12/20  Iteration 2053/3560 Training loss: 1.7742 0.6382 sec/batch
Epoch 12/20  Iteration 2054/3560 Training loss: 1.7741 0.6575 sec/batch
Epoch 12/20  Iteration 2055/3560 Training loss: 1.7741 0.6375 sec/batch
Epoch 12/20  Iteration 2056/3560 Training loss: 1.7737 0.6439 sec/batch
Epoch 12/20  Iteration 2057/3560 Training loss: 1.7734 0.6405 sec/batch
Epoch 12/20  Iteration 2058/3560 Training loss: 1.7729 0.6343 sec/batch
Epoch 12/20  Iteration 2059/3560 Training loss: 1.7728 0.6311 sec/batch
Epoch 12/20  Iteration 2060/3560 Training loss: 1.7727 0.6340 sec/batch
Epoch 12/20  Iteration 2061/3560 Training loss: 1.7725 0.6338 sec/batch
Epoch 12/20  Iteration 2062/3560 Training loss: 1.7722 0.6473 sec/batch
Epoch 12/20  Iteration 2063/3560 Training loss: 1.7719 0.6387 sec/batch
Epoch 12/20  Iteration 2064/3560 Training loss: 1.7718 0.6471 sec/batch
Epoch 12/20  Iteration 2065/3560 Training loss: 1.7718 0.6377 sec/batch
Epoch 12/20  Iteration 2066/3560 Training loss: 1.7717 0.6238 sec/batch
Epoch 12/20  Iteration 2067/3560 Training loss: 1.7719 0.6353 sec/batch
Epoch 12/20  Iteration 2068/3560 Training loss: 1.7718 0.6413 sec/batch
Epoch 12/20  Iteration 2069/3560 Training loss: 1.7718 0.6392 sec/batch
Epoch 12/20  Iteration 2070/3560 Training loss: 1.7716 0.6486 sec/batch
Epoch 12/20  Iteration 2071/3560 Training loss: 1.7715 0.6399 sec/batch
Epoch 12/20  Iteration 2072/3560 Training loss: 1.7713 0.6451 sec/batch
Epoch 12/20  Iteration 2073/3560 Training loss: 1.7710 0.6315 sec/batch
Epoch 12/20  Iteration 2074/3560 Training loss: 1.7708 0.6316 sec/batch
Epoch 12/20  Iteration 2075/3560 Training loss: 1.7707 0.6370 sec/batch
Epoch 12/20  Iteration 2076/3560 Training loss: 1.7706 0.6294 sec/batch
Epoch 12/20  Iteration 2077/3560 Training loss: 1.7705 0.6273 sec/batch
Epoch 12/20  Iteration 2078/3560 Training loss: 1.7704 0.6442 sec/batch
Epoch 12/20  Iteration 2079/3560 Training loss: 1.7705 0.6334 sec/batch
Epoch 12/20  Iteration 2080/3560 Training loss: 1.7702 0.6314 sec/batch
Epoch 12/20  Iteration 2081/3560 Training loss: 1.7700 0.6398 sec/batch
Epoch 12/20  Iteration 2082/3560 Training loss: 1.7700 0.6448 sec/batch
Epoch 12/20  Iteration 2083/3560 Training loss: 1.7699 0.6378 sec/batch
Epoch 12/20  Iteration 2084/3560 Training loss: 1.7695 0.6400 sec/batch
Epoch 12/20  Iteration 2085/3560 Training loss: 1.7696 0.6550 sec/batch
Epoch 12/20  Iteration 2086/3560 Training loss: 1.7696 0.8181 sec/batch
Epoch 12/20  Iteration 2087/3560 Training loss: 1.7695 0.6403 sec/batch
Epoch 12/20  Iteration 2088/3560 Training loss: 1.7694 0.6198 sec/batch
Epoch 12/20  Iteration 2089/3560 Training loss: 1.7691 0.6484 sec/batch
Epoch 12/20  Iteration 2090/3560 Training loss: 1.7689 0.6321 sec/batch
Epoch 12/20  Iteration 2091/3560 Training loss: 1.7690 0.6330 sec/batch
Epoch 12/20  Iteration 2092/3560 Training loss: 1.7689 0.6334 sec/batch
Epoch 12/20  Iteration 2093/3560 Training loss: 1.7689 0.6398 sec/batch
Epoch 12/20  Iteration 2094/3560 Training loss: 1.7690 0.6464 sec/batch
Epoch 12/20  Iteration 2095/3560 Training loss: 1.7692 0.6315 sec/batch
Epoch 12/20  Iteration 2096/3560 Training loss: 1.7692 0.6340 sec/batch
Epoch 12/20  Iteration 2097/3560 Training loss: 1.7693 0.6379 sec/batch
Epoch 12/20  Iteration 2098/3560 Training loss: 1.7693 0.6356 sec/batch
Epoch 12/20  Iteration 2099/3560 Training loss: 1.7695 0.6470 sec/batch
Epoch 12/20  Iteration 2100/3560 Training loss: 1.7695 0.6361 sec/batch
Epoch 12/20  Iteration 2101/3560 Training loss: 1.7694 0.6350 sec/batch
Epoch 12/20  Iteration 2102/3560 Training loss: 1.7694 0.6430 sec/batch
Epoch 12/20  Iteration 2103/3560 Training loss: 1.7694 0.6390 sec/batch
Epoch 12/20  Iteration 2104/3560 Training loss: 1.7695 0.6303 sec/batch
Epoch 12/20  Iteration 2105/3560 Training loss: 1.7695 0.6343 sec/batch
Epoch 12/20  Iteration 2106/3560 Training loss: 1.7697 0.6336 sec/batch
Epoch 12/20  Iteration 2107/3560 Training loss: 1.7697 0.6436 sec/batch
Epoch 12/20  Iteration 2108/3560 Training loss: 1.7696 0.6414 sec/batch
Epoch 12/20  Iteration 2109/3560 Training loss: 1.7694 0.6590 sec/batch
Epoch 12/20  Iteration 2110/3560 Training loss: 1.7695 0.6316 sec/batch
Epoch 12/20  Iteration 2111/3560 Training loss: 1.7696 0.6358 sec/batch
Epoch 12/20  Iteration 2112/3560 Training loss: 1.7696 0.6253 sec/batch
Epoch 12/20  Iteration 2113/3560 Training loss: 1.7696 0.6301 sec/batch
Epoch 12/20  Iteration 2114/3560 Training loss: 1.7695 0.6316 sec/batch
Epoch 12/20  Iteration 2115/3560 Training loss: 1.7696 0.6383 sec/batch
Epoch 12/20  Iteration 2116/3560 Training loss: 1.7695 0.6337 sec/batch
Epoch 12/20  Iteration 2117/3560 Training loss: 1.7693 0.6591 sec/batch
Epoch 12/20  Iteration 2118/3560 Training loss: 1.7694 0.6226 sec/batch
Epoch 12/20  Iteration 2119/3560 Training loss: 1.7696 0.6381 sec/batch
Epoch 12/20  Iteration 2120/3560 Training loss: 1.7695 0.6121 sec/batch
Epoch 12/20  Iteration 2121/3560 Training loss: 1.7695 0.6349 sec/batch
Epoch 12/20  Iteration 2122/3560 Training loss: 1.7695 0.6309 sec/batch
Epoch 12/20  Iteration 2123/3560 Training loss: 1.7694 0.6372 sec/batch
Epoch 12/20  Iteration 2124/3560 Training loss: 1.7694 0.6331 sec/batch
Epoch 12/20  Iteration 2125/3560 Training loss: 1.7694 0.6529 sec/batch
Epoch 12/20  Iteration 2126/3560 Training loss: 1.7699 0.6368 sec/batch
Epoch 12/20  Iteration 2127/3560 Training loss: 1.7697 0.6402 sec/batch
Epoch 12/20  Iteration 2128/3560 Training loss: 1.7697 0.6275 sec/batch
Epoch 12/20  Iteration 2129/3560 Training loss: 1.7696 0.6378 sec/batch
Epoch 12/20  Iteration 2130/3560 Training loss: 1.7695 0.6261 sec/batch
Epoch 12/20  Iteration 2131/3560 Training loss: 1.7696 0.6369 sec/batch
Epoch 12/20  Iteration 2132/3560 Training loss: 1.7695 0.6511 sec/batch
Epoch 12/20  Iteration 2133/3560 Training loss: 1.7696 0.6526 sec/batch
Epoch 12/20  Iteration 2134/3560 Training loss: 1.7695 0.6425 sec/batch
Epoch 12/20  Iteration 2135/3560 Training loss: 1.7694 0.6377 sec/batch
Epoch 12/20  Iteration 2136/3560 Training loss: 1.7694 0.6418 sec/batch
Epoch 13/20  Iteration 2137/3560 Training loss: 1.8218 0.6242 sec/batch
Epoch 13/20  Iteration 2138/3560 Training loss: 1.7914 0.6320 sec/batch
Epoch 13/20  Iteration 2139/3560 Training loss: 1.7795 0.6385 sec/batch
Epoch 13/20  Iteration 2140/3560 Training loss: 1.7730 0.6321 sec/batch
Epoch 13/20  Iteration 2141/3560 Training loss: 1.7670 0.6504 sec/batch
Epoch 13/20  Iteration 2142/3560 Training loss: 1.7582 0.6359 sec/batch
Epoch 13/20  Iteration 2143/3560 Training loss: 1.7603 0.6365 sec/batch
Epoch 13/20  Iteration 2144/3560 Training loss: 1.7591 0.6439 sec/batch
Epoch 13/20  Iteration 2145/3560 Training loss: 1.7613 0.6441 sec/batch
Epoch 13/20  Iteration 2146/3560 Training loss: 1.7601 0.6285 sec/batch
Epoch 13/20  Iteration 2147/3560 Training loss: 1.7570 0.6364 sec/batch
Epoch 13/20  Iteration 2148/3560 Training loss: 1.7560 0.6381 sec/batch
Epoch 13/20  Iteration 2149/3560 Training loss: 1.7552 0.6539 sec/batch
Epoch 13/20  Iteration 2150/3560 Training loss: 1.7575 0.6406 sec/batch
Epoch 13/20  Iteration 2151/3560 Training loss: 1.7569 0.6382 sec/batch
Epoch 13/20  Iteration 2152/3560 Training loss: 1.7554 0.6371 sec/batch
Epoch 13/20  Iteration 2153/3560 Training loss: 1.7560 0.7206 sec/batch
Epoch 13/20  Iteration 2154/3560 Training loss: 1.7580 0.6363 sec/batch
Epoch 13/20  Iteration 2155/3560 Training loss: 1.7579 0.6752 sec/batch
Epoch 13/20  Iteration 2156/3560 Training loss: 1.7581 0.6614 sec/batch
Epoch 13/20  Iteration 2157/3560 Training loss: 1.7581 0.6419 sec/batch
Epoch 13/20  Iteration 2158/3560 Training loss: 1.7590 0.6335 sec/batch
Epoch 13/20  Iteration 2159/3560 Training loss: 1.7580 0.6314 sec/batch
Epoch 13/20  Iteration 2160/3560 Training loss: 1.7575 0.6457 sec/batch
Epoch 13/20  Iteration 2161/3560 Training loss: 1.7572 0.6499 sec/batch
Epoch 13/20  Iteration 2162/3560 Training loss: 1.7560 0.6442 sec/batch
Epoch 13/20  Iteration 2163/3560 Training loss: 1.7551 0.6459 sec/batch
Epoch 13/20  Iteration 2164/3560 Training loss: 1.7559 0.6530 sec/batch
Epoch 13/20  Iteration 2165/3560 Training loss: 1.7567 0.6403 sec/batch
Epoch 13/20  Iteration 2166/3560 Training loss: 1.7572 0.6394 sec/batch
Epoch 13/20  Iteration 2167/3560 Training loss: 1.7568 0.6299 sec/batch
Epoch 13/20  Iteration 2168/3560 Training loss: 1.7565 0.6330 sec/batch
Epoch 13/20  Iteration 2169/3560 Training loss: 1.7572 0.6338 sec/batch
Epoch 13/20  Iteration 2170/3560 Training loss: 1.7576 0.6267 sec/batch
Epoch 13/20  Iteration 2171/3560 Training loss: 1.7576 0.6458 sec/batch
Epoch 13/20  Iteration 2172/3560 Training loss: 1.7575 0.6496 sec/batch
Epoch 13/20  Iteration 2173/3560 Training loss: 1.7570 0.6552 sec/batch
Epoch 13/20  Iteration 2174/3560 Training loss: 1.7562 0.6313 sec/batch
Epoch 13/20  Iteration 2175/3560 Training loss: 1.7552 0.6317 sec/batch
Epoch 13/20  Iteration 2176/3560 Training loss: 1.7549 0.6360 sec/batch
Epoch 13/20  Iteration 2177/3560 Training loss: 1.7547 0.6348 sec/batch
Epoch 13/20  Iteration 2178/3560 Training loss: 1.7553 0.6475 sec/batch
Epoch 13/20  Iteration 2179/3560 Training loss: 1.7551 0.6366 sec/batch
Epoch 13/20  Iteration 2180/3560 Training loss: 1.7544 0.6617 sec/batch
Epoch 13/20  Iteration 2181/3560 Training loss: 1.7545 0.6288 sec/batch
Epoch 13/20  Iteration 2182/3560 Training loss: 1.7534 0.6373 sec/batch
Epoch 13/20  Iteration 2183/3560 Training loss: 1.7532 0.6312 sec/batch
Epoch 13/20  Iteration 2184/3560 Training loss: 1.7528 0.6413 sec/batch
Epoch 13/20  Iteration 2185/3560 Training loss: 1.7525 0.6261 sec/batch
Epoch 13/20  Iteration 2186/3560 Training loss: 1.7532 0.6394 sec/batch
Epoch 13/20  Iteration 2187/3560 Training loss: 1.7527 0.6355 sec/batch
Epoch 13/20  Iteration 2188/3560 Training loss: 1.7536 0.6482 sec/batch
Epoch 13/20  Iteration 2189/3560 Training loss: 1.7534 0.6486 sec/batch
Epoch 13/20  Iteration 2190/3560 Training loss: 1.7536 0.6267 sec/batch
Epoch 13/20  Iteration 2191/3560 Training loss: 1.7534 0.6191 sec/batch
Epoch 13/20  Iteration 2192/3560 Training loss: 1.7535 0.6425 sec/batch
Epoch 13/20  Iteration 2193/3560 Training loss: 1.7540 0.6375 sec/batch
Epoch 13/20  Iteration 2194/3560 Training loss: 1.7538 0.6366 sec/batch
Epoch 13/20  Iteration 2195/3560 Training loss: 1.7533 0.6404 sec/batch
Epoch 13/20  Iteration 2196/3560 Training loss: 1.7536 0.6497 sec/batch
Epoch 13/20  Iteration 2197/3560 Training loss: 1.7534 0.6346 sec/batch
Epoch 13/20  Iteration 2198/3560 Training loss: 1.7542 0.6463 sec/batch
Epoch 13/20  Iteration 2199/3560 Training loss: 1.7545 0.6384 sec/batch
Epoch 13/20  Iteration 2200/3560 Training loss: 1.7546 0.6368 sec/batch
Epoch 13/20  Iteration 2201/3560 Training loss: 1.7544 0.6634 sec/batch
Epoch 13/20  Iteration 2202/3560 Training loss: 1.7546 0.6411 sec/batch
Epoch 13/20  Iteration 2203/3560 Training loss: 1.7548 0.6391 sec/batch
Epoch 13/20  Iteration 2204/3560 Training loss: 1.7544 0.6602 sec/batch
Epoch 13/20  Iteration 2205/3560 Training loss: 1.7544 0.6405 sec/batch
Epoch 13/20  Iteration 2206/3560 Training loss: 1.7543 0.6336 sec/batch
Epoch 13/20  Iteration 2207/3560 Training loss: 1.7548 0.6278 sec/batch
Epoch 13/20  Iteration 2208/3560 Training loss: 1.7550 0.6316 sec/batch
Epoch 13/20  Iteration 2209/3560 Training loss: 1.7553 0.6236 sec/batch
Epoch 13/20  Iteration 2210/3560 Training loss: 1.7551 0.6377 sec/batch
Epoch 13/20  Iteration 2211/3560 Training loss: 1.7551 0.6298 sec/batch
Epoch 13/20  Iteration 2212/3560 Training loss: 1.7553 0.6652 sec/batch
Epoch 13/20  Iteration 2213/3560 Training loss: 1.7553 0.6373 sec/batch
Epoch 13/20  Iteration 2214/3560 Training loss: 1.7552 0.6497 sec/batch
Epoch 13/20  Iteration 2215/3560 Training loss: 1.7548 0.6253 sec/batch
Epoch 13/20  Iteration 2216/3560 Training loss: 1.7545 0.6275 sec/batch
Epoch 13/20  Iteration 2217/3560 Training loss: 1.7539 0.6328 sec/batch
Epoch 13/20  Iteration 2218/3560 Training loss: 1.7541 0.6350 sec/batch
Epoch 13/20  Iteration 2219/3560 Training loss: 1.7535 0.6362 sec/batch
Epoch 13/20  Iteration 2220/3560 Training loss: 1.7535 0.6441 sec/batch
Epoch 13/20  Iteration 2221/3560 Training loss: 1.7530 0.6377 sec/batch
Epoch 13/20  Iteration 2222/3560 Training loss: 1.7526 0.6345 sec/batch
Epoch 13/20  Iteration 2223/3560 Training loss: 1.7524 0.6401 sec/batch
Epoch 13/20  Iteration 2224/3560 Training loss: 1.7521 0.6411 sec/batch
Epoch 13/20  Iteration 2225/3560 Training loss: 1.7517 0.6509 sec/batch
Epoch 13/20  Iteration 2226/3560 Training loss: 1.7519 0.6322 sec/batch
Epoch 13/20  Iteration 2227/3560 Training loss: 1.7516 0.6460 sec/batch
Epoch 13/20  Iteration 2228/3560 Training loss: 1.7515 0.6358 sec/batch
Epoch 13/20  Iteration 2229/3560 Training loss: 1.7512 0.6299 sec/batch
Epoch 13/20  Iteration 2230/3560 Training loss: 1.7508 0.6260 sec/batch
Epoch 13/20  Iteration 2231/3560 Training loss: 1.7505 0.6320 sec/batch
Epoch 13/20  Iteration 2232/3560 Training loss: 1.7505 0.6291 sec/batch
Epoch 13/20  Iteration 2233/3560 Training loss: 1.7505 0.6520 sec/batch
Epoch 13/20  Iteration 2234/3560 Training loss: 1.7501 0.6371 sec/batch
Epoch 13/20  Iteration 2235/3560 Training loss: 1.7498 0.6563 sec/batch
Epoch 13/20  Iteration 2236/3560 Training loss: 1.7494 0.6329 sec/batch
Epoch 13/20  Iteration 2237/3560 Training loss: 1.7495 0.6396 sec/batch
Epoch 13/20  Iteration 2238/3560 Training loss: 1.7494 0.6415 sec/batch
Epoch 13/20  Iteration 2239/3560 Training loss: 1.7492 0.6429 sec/batch
Epoch 13/20  Iteration 2240/3560 Training loss: 1.7491 0.6380 sec/batch
Epoch 13/20  Iteration 2241/3560 Training loss: 1.7488 0.6478 sec/batch
Epoch 13/20  Iteration 2242/3560 Training loss: 1.7488 0.6333 sec/batch
Epoch 13/20  Iteration 2243/3560 Training loss: 1.7486 0.6555 sec/batch
Epoch 13/20  Iteration 2244/3560 Training loss: 1.7487 0.6416 sec/batch
Epoch 13/20  Iteration 2245/3560 Training loss: 1.7487 0.6424 sec/batch
Epoch 13/20  Iteration 2246/3560 Training loss: 1.7486 0.6383 sec/batch
Epoch 13/20  Iteration 2247/3560 Training loss: 1.7485 0.6446 sec/batch
Epoch 13/20  Iteration 2248/3560 Training loss: 1.7483 0.6271 sec/batch
Epoch 13/20  Iteration 2249/3560 Training loss: 1.7482 0.6408 sec/batch
Epoch 13/20  Iteration 2250/3560 Training loss: 1.7479 0.6387 sec/batch
Epoch 13/20  Iteration 2251/3560 Training loss: 1.7477 0.6553 sec/batch
Epoch 13/20  Iteration 2252/3560 Training loss: 1.7473 0.6069 sec/batch
Epoch 13/20  Iteration 2253/3560 Training loss: 1.7474 0.6272 sec/batch
Epoch 13/20  Iteration 2254/3560 Training loss: 1.7474 0.6239 sec/batch
Epoch 13/20  Iteration 2255/3560 Training loss: 1.7473 0.6400 sec/batch
Epoch 13/20  Iteration 2256/3560 Training loss: 1.7473 0.6419 sec/batch
Epoch 13/20  Iteration 2257/3560 Training loss: 1.7473 0.6151 sec/batch
Epoch 13/20  Iteration 2258/3560 Training loss: 1.7470 0.6389 sec/batch
Epoch 13/20  Iteration 2259/3560 Training loss: 1.7467 0.6497 sec/batch
Epoch 13/20  Iteration 2260/3560 Training loss: 1.7468 0.6342 sec/batch
Epoch 13/20  Iteration 2261/3560 Training loss: 1.7467 0.6349 sec/batch
Epoch 13/20  Iteration 2262/3560 Training loss: 1.7464 0.6143 sec/batch
Epoch 13/20  Iteration 2263/3560 Training loss: 1.7465 0.6321 sec/batch
Epoch 13/20  Iteration 2264/3560 Training loss: 1.7466 0.6273 sec/batch
Epoch 13/20  Iteration 2265/3560 Training loss: 1.7465 0.6400 sec/batch
Epoch 13/20  Iteration 2266/3560 Training loss: 1.7463 0.6312 sec/batch
Epoch 13/20  Iteration 2267/3560 Training loss: 1.7460 0.6482 sec/batch
Epoch 13/20  Iteration 2268/3560 Training loss: 1.7458 0.6478 sec/batch
Epoch 13/20  Iteration 2269/3560 Training loss: 1.7459 0.6372 sec/batch
Epoch 13/20  Iteration 2270/3560 Training loss: 1.7459 0.6320 sec/batch
Epoch 13/20  Iteration 2271/3560 Training loss: 1.7458 0.6491 sec/batch
Epoch 13/20  Iteration 2272/3560 Training loss: 1.7458 0.6376 sec/batch
Epoch 13/20  Iteration 2273/3560 Training loss: 1.7459 0.6386 sec/batch
Epoch 13/20  Iteration 2274/3560 Training loss: 1.7461 0.6316 sec/batch
Epoch 13/20  Iteration 2275/3560 Training loss: 1.7463 0.6489 sec/batch
Epoch 13/20  Iteration 2276/3560 Training loss: 1.7462 0.6326 sec/batch
Epoch 13/20  Iteration 2277/3560 Training loss: 1.7464 0.6332 sec/batch
Epoch 13/20  Iteration 2278/3560 Training loss: 1.7464 0.6615 sec/batch
Epoch 13/20  Iteration 2279/3560 Training loss: 1.7464 0.7695 sec/batch
Epoch 13/20  Iteration 2280/3560 Training loss: 1.7463 0.8011 sec/batch
Epoch 13/20  Iteration 2281/3560 Training loss: 1.7463 0.7166 sec/batch
Epoch 13/20  Iteration 2282/3560 Training loss: 1.7464 0.6553 sec/batch
Epoch 13/20  Iteration 2283/3560 Training loss: 1.7464 0.6305 sec/batch
Epoch 13/20  Iteration 2284/3560 Training loss: 1.7466 0.6411 sec/batch
Epoch 13/20  Iteration 2285/3560 Training loss: 1.7467 0.6265 sec/batch
Epoch 13/20  Iteration 2286/3560 Training loss: 1.7466 0.6435 sec/batch
Epoch 13/20  Iteration 2287/3560 Training loss: 1.7464 0.6171 sec/batch
Epoch 13/20  Iteration 2288/3560 Training loss: 1.7465 0.6381 sec/batch
Epoch 13/20  Iteration 2289/3560 Training loss: 1.7465 0.6296 sec/batch
Epoch 13/20  Iteration 2290/3560 Training loss: 1.7465 0.6434 sec/batch
Epoch 13/20  Iteration 2291/3560 Training loss: 1.7465 0.6288 sec/batch
Epoch 13/20  Iteration 2292/3560 Training loss: 1.7465 0.6361 sec/batch
Epoch 13/20  Iteration 2293/3560 Training loss: 1.7465 0.6357 sec/batch
Epoch 13/20  Iteration 2294/3560 Training loss: 1.7465 0.6313 sec/batch
Epoch 13/20  Iteration 2295/3560 Training loss: 1.7463 0.6244 sec/batch
Epoch 13/20  Iteration 2296/3560 Training loss: 1.7463 0.6387 sec/batch
Epoch 13/20  Iteration 2297/3560 Training loss: 1.7465 0.6324 sec/batch
Epoch 13/20  Iteration 2298/3560 Training loss: 1.7464 0.6497 sec/batch
Epoch 13/20  Iteration 2299/3560 Training loss: 1.7464 0.6403 sec/batch
Epoch 13/20  Iteration 2300/3560 Training loss: 1.7464 0.6409 sec/batch
Epoch 13/20  Iteration 2301/3560 Training loss: 1.7464 0.6460 sec/batch
Epoch 13/20  Iteration 2302/3560 Training loss: 1.7463 0.6347 sec/batch
Epoch 13/20  Iteration 2303/3560 Training loss: 1.7464 0.6342 sec/batch
Epoch 13/20  Iteration 2304/3560 Training loss: 1.7469 0.6407 sec/batch
Epoch 13/20  Iteration 2305/3560 Training loss: 1.7468 0.6352 sec/batch
Epoch 13/20  Iteration 2306/3560 Training loss: 1.7467 0.6582 sec/batch
Epoch 13/20  Iteration 2307/3560 Training loss: 1.7465 0.6203 sec/batch
Epoch 13/20  Iteration 2308/3560 Training loss: 1.7463 0.6115 sec/batch
Epoch 13/20  Iteration 2309/3560 Training loss: 1.7464 0.6399 sec/batch
Epoch 13/20  Iteration 2310/3560 Training loss: 1.7464 0.6953 sec/batch
Epoch 13/20  Iteration 2311/3560 Training loss: 1.7465 0.6463 sec/batch
Epoch 13/20  Iteration 2312/3560 Training loss: 1.7464 0.6325 sec/batch
Epoch 13/20  Iteration 2313/3560 Training loss: 1.7463 0.6354 sec/batch
Epoch 13/20  Iteration 2314/3560 Training loss: 1.7463 0.6502 sec/batch
Epoch 14/20  Iteration 2315/3560 Training loss: 1.8053 0.6256 sec/batch
Epoch 14/20  Iteration 2316/3560 Training loss: 1.7717 0.6363 sec/batch
Epoch 14/20  Iteration 2317/3560 Training loss: 1.7578 0.6559 sec/batch
Epoch 14/20  Iteration 2318/3560 Training loss: 1.7558 0.6366 sec/batch
Epoch 14/20  Iteration 2319/3560 Training loss: 1.7487 0.6325 sec/batch
Epoch 14/20  Iteration 2320/3560 Training loss: 1.7391 0.6216 sec/batch
Epoch 14/20  Iteration 2321/3560 Training loss: 1.7390 0.6350 sec/batch
Epoch 14/20  Iteration 2322/3560 Training loss: 1.7391 0.6540 sec/batch
Epoch 14/20  Iteration 2323/3560 Training loss: 1.7408 0.6383 sec/batch
Epoch 14/20  Iteration 2324/3560 Training loss: 1.7402 0.6277 sec/batch
Epoch 14/20  Iteration 2325/3560 Training loss: 1.7374 0.6424 sec/batch
Epoch 14/20  Iteration 2326/3560 Training loss: 1.7361 0.6207 sec/batch
Epoch 14/20  Iteration 2327/3560 Training loss: 1.7363 0.6335 sec/batch
Epoch 14/20  Iteration 2328/3560 Training loss: 1.7392 0.6284 sec/batch
Epoch 14/20  Iteration 2329/3560 Training loss: 1.7381 0.6431 sec/batch
Epoch 14/20  Iteration 2330/3560 Training loss: 1.7363 0.6708 sec/batch
Epoch 14/20  Iteration 2331/3560 Training loss: 1.7362 0.6667 sec/batch
Epoch 14/20  Iteration 2332/3560 Training loss: 1.7383 0.6470 sec/batch
Epoch 14/20  Iteration 2333/3560 Training loss: 1.7379 0.6384 sec/batch
Epoch 14/20  Iteration 2334/3560 Training loss: 1.7383 0.6307 sec/batch
Epoch 14/20  Iteration 2335/3560 Training loss: 1.7379 0.6305 sec/batch
Epoch 14/20  Iteration 2336/3560 Training loss: 1.7389 0.6396 sec/batch
Epoch 14/20  Iteration 2337/3560 Training loss: 1.7383 0.6486 sec/batch
Epoch 14/20  Iteration 2338/3560 Training loss: 1.7380 0.6346 sec/batch
Epoch 14/20  Iteration 2339/3560 Training loss: 1.7377 0.7355 sec/batch
Epoch 14/20  Iteration 2340/3560 Training loss: 1.7370 0.6283 sec/batch
Epoch 14/20  Iteration 2341/3560 Training loss: 1.7363 0.6901 sec/batch
Epoch 14/20  Iteration 2342/3560 Training loss: 1.7369 0.6483 sec/batch
Epoch 14/20  Iteration 2343/3560 Training loss: 1.7377 0.6322 sec/batch
Epoch 14/20  Iteration 2344/3560 Training loss: 1.7382 0.6197 sec/batch
Epoch 14/20  Iteration 2345/3560 Training loss: 1.7381 0.6574 sec/batch
Epoch 14/20  Iteration 2346/3560 Training loss: 1.7374 0.6326 sec/batch
Epoch 14/20  Iteration 2347/3560 Training loss: 1.7380 0.6370 sec/batch
Epoch 14/20  Iteration 2348/3560 Training loss: 1.7385 0.6477 sec/batch
Epoch 14/20  Iteration 2349/3560 Training loss: 1.7388 0.6462 sec/batch
Epoch 14/20  Iteration 2350/3560 Training loss: 1.7386 0.6384 sec/batch
Epoch 14/20  Iteration 2351/3560 Training loss: 1.7380 0.6340 sec/batch
Epoch 14/20  Iteration 2352/3560 Training loss: 1.7368 0.6356 sec/batch
Epoch 14/20  Iteration 2353/3560 Training loss: 1.7358 0.6567 sec/batch
Epoch 14/20  Iteration 2354/3560 Training loss: 1.7353 0.6485 sec/batch
Epoch 14/20  Iteration 2355/3560 Training loss: 1.7351 0.6446 sec/batch
Epoch 14/20  Iteration 2356/3560 Training loss: 1.7352 0.6347 sec/batch
Epoch 14/20  Iteration 2357/3560 Training loss: 1.7349 0.6355 sec/batch
Epoch 14/20  Iteration 2358/3560 Training loss: 1.7340 0.6464 sec/batch
Epoch 14/20  Iteration 2359/3560 Training loss: 1.7342 0.6354 sec/batch
Epoch 14/20  Iteration 2360/3560 Training loss: 1.7331 0.6178 sec/batch
Epoch 14/20  Iteration 2361/3560 Training loss: 1.7328 0.6620 sec/batch
Epoch 14/20  Iteration 2362/3560 Training loss: 1.7322 0.6391 sec/batch
Epoch 14/20  Iteration 2363/3560 Training loss: 1.7319 0.6429 sec/batch
Epoch 14/20  Iteration 2364/3560 Training loss: 1.7326 0.6467 sec/batch
Epoch 14/20  Iteration 2365/3560 Training loss: 1.7322 0.6274 sec/batch
Epoch 14/20  Iteration 2366/3560 Training loss: 1.7330 0.6475 sec/batch
Epoch 14/20  Iteration 2367/3560 Training loss: 1.7329 0.6201 sec/batch
Epoch 14/20  Iteration 2368/3560 Training loss: 1.7330 0.6391 sec/batch
Epoch 14/20  Iteration 2369/3560 Training loss: 1.7328 0.6553 sec/batch
Epoch 14/20  Iteration 2370/3560 Training loss: 1.7330 0.6349 sec/batch
Epoch 14/20  Iteration 2371/3560 Training loss: 1.7336 0.6386 sec/batch
Epoch 14/20  Iteration 2372/3560 Training loss: 1.7334 0.6448 sec/batch
Epoch 14/20  Iteration 2373/3560 Training loss: 1.7328 0.6337 sec/batch
Epoch 14/20  Iteration 2374/3560 Training loss: 1.7334 0.6494 sec/batch
Epoch 14/20  Iteration 2375/3560 Training loss: 1.7332 0.6485 sec/batch
Epoch 14/20  Iteration 2376/3560 Training loss: 1.7341 0.6437 sec/batch
Epoch 14/20  Iteration 2377/3560 Training loss: 1.7344 0.6431 sec/batch
Epoch 14/20  Iteration 2378/3560 Training loss: 1.7347 0.6163 sec/batch
Epoch 14/20  Iteration 2379/3560 Training loss: 1.7346 0.6359 sec/batch
Epoch 14/20  Iteration 2380/3560 Training loss: 1.7348 0.6185 sec/batch
Epoch 14/20  Iteration 2381/3560 Training loss: 1.7351 0.6394 sec/batch
Epoch 14/20  Iteration 2382/3560 Training loss: 1.7346 0.6458 sec/batch
Epoch 14/20  Iteration 2383/3560 Training loss: 1.7345 0.6383 sec/batch
Epoch 14/20  Iteration 2384/3560 Training loss: 1.7345 0.6539 sec/batch
Epoch 14/20  Iteration 2385/3560 Training loss: 1.7350 0.6367 sec/batch
Epoch 14/20  Iteration 2386/3560 Training loss: 1.7350 0.6319 sec/batch
Epoch 14/20  Iteration 2387/3560 Training loss: 1.7354 0.6374 sec/batch
Epoch 14/20  Iteration 2388/3560 Training loss: 1.7351 0.6283 sec/batch
Epoch 14/20  Iteration 2389/3560 Training loss: 1.7351 0.6319 sec/batch
Epoch 14/20  Iteration 2390/3560 Training loss: 1.7351 0.6321 sec/batch
Epoch 14/20  Iteration 2391/3560 Training loss: 1.7350 0.6318 sec/batch
Epoch 14/20  Iteration 2392/3560 Training loss: 1.7351 0.6652 sec/batch
Epoch 14/20  Iteration 2393/3560 Training loss: 1.7345 0.6347 sec/batch
Epoch 14/20  Iteration 2394/3560 Training loss: 1.7345 0.6318 sec/batch
Epoch 14/20  Iteration 2395/3560 Training loss: 1.7339 0.6352 sec/batch
Epoch 14/20  Iteration 2396/3560 Training loss: 1.7339 0.6273 sec/batch
Epoch 14/20  Iteration 2397/3560 Training loss: 1.7333 0.6407 sec/batch
Epoch 14/20  Iteration 2398/3560 Training loss: 1.7332 0.6369 sec/batch
Epoch 14/20  Iteration 2399/3560 Training loss: 1.7329 0.6376 sec/batch
Epoch 14/20  Iteration 2400/3560 Training loss: 1.7326 0.6458 sec/batch
Epoch 14/20  Iteration 2401/3560 Training loss: 1.7325 0.6261 sec/batch
Epoch 14/20  Iteration 2402/3560 Training loss: 1.7322 0.6391 sec/batch
Epoch 14/20  Iteration 2403/3560 Training loss: 1.7318 0.6457 sec/batch
Epoch 14/20  Iteration 2404/3560 Training loss: 1.7318 0.6327 sec/batch
Epoch 14/20  Iteration 2405/3560 Training loss: 1.7316 0.6304 sec/batch
Epoch 14/20  Iteration 2406/3560 Training loss: 1.7316 0.6370 sec/batch
Epoch 14/20  Iteration 2407/3560 Training loss: 1.7313 0.6213 sec/batch
Epoch 14/20  Iteration 2408/3560 Training loss: 1.7310 0.6608 sec/batch
Epoch 14/20  Iteration 2409/3560 Training loss: 1.7307 0.6223 sec/batch
Epoch 14/20  Iteration 2410/3560 Training loss: 1.7307 0.6524 sec/batch
Epoch 14/20  Iteration 2411/3560 Training loss: 1.7306 0.6324 sec/batch
Epoch 14/20  Iteration 2412/3560 Training loss: 1.7303 0.6292 sec/batch
Epoch 14/20  Iteration 2413/3560 Training loss: 1.7300 0.6359 sec/batch
Epoch 14/20  Iteration 2414/3560 Training loss: 1.7295 0.6371 sec/batch
Epoch 14/20  Iteration 2415/3560 Training loss: 1.7295 0.6305 sec/batch
Epoch 14/20  Iteration 2416/3560 Training loss: 1.7294 0.6502 sec/batch
Epoch 14/20  Iteration 2417/3560 Training loss: 1.7292 0.6302 sec/batch
Epoch 14/20  Iteration 2418/3560 Training loss: 1.7290 0.6387 sec/batch
Epoch 14/20  Iteration 2419/3560 Training loss: 1.7288 0.6459 sec/batch
Epoch 14/20  Iteration 2420/3560 Training loss: 1.7288 0.6285 sec/batch
Epoch 14/20  Iteration 2421/3560 Training loss: 1.7287 0.6334 sec/batch
Epoch 14/20  Iteration 2422/3560 Training loss: 1.7287 0.6340 sec/batch
Epoch 14/20  Iteration 2423/3560 Training loss: 1.7289 0.6313 sec/batch
Epoch 14/20  Iteration 2424/3560 Training loss: 1.7289 0.6505 sec/batch
Epoch 14/20  Iteration 2425/3560 Training loss: 1.7288 0.6517 sec/batch
Epoch 14/20  Iteration 2426/3560 Training loss: 1.7286 0.6400 sec/batch
Epoch 14/20  Iteration 2427/3560 Training loss: 1.7285 0.6375 sec/batch
Epoch 14/20  Iteration 2428/3560 Training loss: 1.7284 0.6487 sec/batch
Epoch 14/20  Iteration 2429/3560 Training loss: 1.7282 0.6313 sec/batch
Epoch 14/20  Iteration 2430/3560 Training loss: 1.7279 0.6374 sec/batch
Epoch 14/20  Iteration 2431/3560 Training loss: 1.7280 0.6327 sec/batch
Epoch 14/20  Iteration 2432/3560 Training loss: 1.7279 0.6592 sec/batch
Epoch 14/20  Iteration 2433/3560 Training loss: 1.7279 0.6273 sec/batch
Epoch 14/20  Iteration 2434/3560 Training loss: 1.7278 0.6355 sec/batch
Epoch 14/20  Iteration 2435/3560 Training loss: 1.7277 0.6407 sec/batch
Epoch 14/20  Iteration 2436/3560 Training loss: 1.7273 0.6359 sec/batch
Epoch 14/20  Iteration 2437/3560 Training loss: 1.7271 0.6391 sec/batch
Epoch 14/20  Iteration 2438/3560 Training loss: 1.7273 0.6306 sec/batch
Epoch 14/20  Iteration 2439/3560 Training loss: 1.7272 0.6336 sec/batch
Epoch 14/20  Iteration 2440/3560 Training loss: 1.7269 0.6560 sec/batch
Epoch 14/20  Iteration 2441/3560 Training loss: 1.7270 0.6320 sec/batch
Epoch 14/20  Iteration 2442/3560 Training loss: 1.7271 0.6338 sec/batch
Epoch 14/20  Iteration 2443/3560 Training loss: 1.7270 0.6388 sec/batch
Epoch 14/20  Iteration 2444/3560 Training loss: 1.7268 0.6244 sec/batch
Epoch 14/20  Iteration 2445/3560 Training loss: 1.7266 0.6288 sec/batch
Epoch 14/20  Iteration 2446/3560 Training loss: 1.7264 0.6320 sec/batch
Epoch 14/20  Iteration 2447/3560 Training loss: 1.7264 0.6404 sec/batch
Epoch 14/20  Iteration 2448/3560 Training loss: 1.7265 0.6549 sec/batch
Epoch 14/20  Iteration 2449/3560 Training loss: 1.7264 0.6357 sec/batch
Epoch 14/20  Iteration 2450/3560 Training loss: 1.7265 0.6427 sec/batch
Epoch 14/20  Iteration 2451/3560 Training loss: 1.7266 0.6389 sec/batch
Epoch 14/20  Iteration 2452/3560 Training loss: 1.7267 0.6341 sec/batch
Epoch 14/20  Iteration 2453/3560 Training loss: 1.7269 0.6392 sec/batch
Epoch 14/20  Iteration 2454/3560 Training loss: 1.7268 0.6360 sec/batch
Epoch 14/20  Iteration 2455/3560 Training loss: 1.7270 0.6455 sec/batch
Epoch 14/20  Iteration 2456/3560 Training loss: 1.7270 0.6510 sec/batch
Epoch 14/20  Iteration 2457/3560 Training loss: 1.7269 0.6579 sec/batch
Epoch 14/20  Iteration 2458/3560 Training loss: 1.7269 0.6257 sec/batch
Epoch 14/20  Iteration 2459/3560 Training loss: 1.7268 0.6441 sec/batch
Epoch 14/20  Iteration 2460/3560 Training loss: 1.7269 0.6365 sec/batch
Epoch 14/20  Iteration 2461/3560 Training loss: 1.7270 0.6250 sec/batch
Epoch 14/20  Iteration 2462/3560 Training loss: 1.7272 0.6311 sec/batch
Epoch 14/20  Iteration 2463/3560 Training loss: 1.7272 0.6606 sec/batch
Epoch 14/20  Iteration 2464/3560 Training loss: 1.7270 0.6499 sec/batch
Epoch 14/20  Iteration 2465/3560 Training loss: 1.7268 0.6389 sec/batch
Epoch 14/20  Iteration 2466/3560 Training loss: 1.7268 0.6373 sec/batch
Epoch 14/20  Iteration 2467/3560 Training loss: 1.7268 0.6304 sec/batch
Epoch 14/20  Iteration 2468/3560 Training loss: 1.7268 0.6330 sec/batch
Epoch 14/20  Iteration 2469/3560 Training loss: 1.7268 0.6360 sec/batch
Epoch 14/20  Iteration 2470/3560 Training loss: 1.7268 0.6311 sec/batch
Epoch 14/20  Iteration 2471/3560 Training loss: 1.7269 0.6507 sec/batch
Epoch 14/20  Iteration 2472/3560 Training loss: 1.7269 0.6153 sec/batch
Epoch 14/20  Iteration 2473/3560 Training loss: 1.7267 0.6450 sec/batch
Epoch 14/20  Iteration 2474/3560 Training loss: 1.7268 0.6347 sec/batch
Epoch 14/20  Iteration 2475/3560 Training loss: 1.7270 0.6430 sec/batch
Epoch 14/20  Iteration 2476/3560 Training loss: 1.7269 0.6276 sec/batch
Epoch 14/20  Iteration 2477/3560 Training loss: 1.7269 0.6367 sec/batch
Epoch 14/20  Iteration 2478/3560 Training loss: 1.7269 0.6480 sec/batch
Epoch 14/20  Iteration 2479/3560 Training loss: 1.7268 0.6485 sec/batch
Epoch 14/20  Iteration 2480/3560 Training loss: 1.7268 0.6390 sec/batch
Epoch 14/20  Iteration 2481/3560 Training loss: 1.7268 0.7481 sec/batch
Epoch 14/20  Iteration 2482/3560 Training loss: 1.7272 0.7446 sec/batch
Epoch 14/20  Iteration 2483/3560 Training loss: 1.7272 0.6499 sec/batch
Epoch 14/20  Iteration 2484/3560 Training loss: 1.7271 0.6450 sec/batch
Epoch 14/20  Iteration 2485/3560 Training loss: 1.7270 0.6341 sec/batch
Epoch 14/20  Iteration 2486/3560 Training loss: 1.7268 0.6409 sec/batch
Epoch 14/20  Iteration 2487/3560 Training loss: 1.7269 0.6580 sec/batch
Epoch 14/20  Iteration 2488/3560 Training loss: 1.7268 0.6290 sec/batch
Epoch 14/20  Iteration 2489/3560 Training loss: 1.7270 0.6376 sec/batch
Epoch 14/20  Iteration 2490/3560 Training loss: 1.7269 0.6330 sec/batch
Epoch 14/20  Iteration 2491/3560 Training loss: 1.7268 0.6280 sec/batch
Epoch 14/20  Iteration 2492/3560 Training loss: 1.7269 0.6468 sec/batch
Epoch 15/20  Iteration 2493/3560 Training loss: 1.7934 0.6355 sec/batch
Epoch 15/20  Iteration 2494/3560 Training loss: 1.7582 0.6167 sec/batch
Epoch 15/20  Iteration 2495/3560 Training loss: 1.7417 0.6535 sec/batch
Epoch 15/20  Iteration 2496/3560 Training loss: 1.7344 0.6394 sec/batch
Epoch 15/20  Iteration 2497/3560 Training loss: 1.7273 0.6357 sec/batch
Epoch 15/20  Iteration 2498/3560 Training loss: 1.7197 0.6228 sec/batch
Epoch 15/20  Iteration 2499/3560 Training loss: 1.7202 0.6461 sec/batch
Epoch 15/20  Iteration 2500/3560 Training loss: 1.7199 0.6331 sec/batch
Epoch 15/20  Iteration 2501/3560 Training loss: 1.7234 0.6450 sec/batch
Epoch 15/20  Iteration 2502/3560 Training loss: 1.7235 0.6456 sec/batch
Epoch 15/20  Iteration 2503/3560 Training loss: 1.7199 0.6496 sec/batch
Epoch 15/20  Iteration 2504/3560 Training loss: 1.7196 0.6422 sec/batch
Epoch 15/20  Iteration 2505/3560 Training loss: 1.7197 0.6420 sec/batch
Epoch 15/20  Iteration 2506/3560 Training loss: 1.7226 0.6401 sec/batch
Epoch 15/20  Iteration 2507/3560 Training loss: 1.7218 0.6324 sec/batch
Epoch 15/20  Iteration 2508/3560 Training loss: 1.7199 0.6355 sec/batch
Epoch 15/20  Iteration 2509/3560 Training loss: 1.7201 0.6348 sec/batch
Epoch 15/20  Iteration 2510/3560 Training loss: 1.7225 0.6499 sec/batch
Epoch 15/20  Iteration 2511/3560 Training loss: 1.7223 0.6509 sec/batch
Epoch 15/20  Iteration 2512/3560 Training loss: 1.7230 0.6302 sec/batch
Epoch 15/20  Iteration 2513/3560 Training loss: 1.7226 0.6404 sec/batch
Epoch 15/20  Iteration 2514/3560 Training loss: 1.7235 0.6342 sec/batch
Epoch 15/20  Iteration 2515/3560 Training loss: 1.7222 0.6394 sec/batch
Epoch 15/20  Iteration 2516/3560 Training loss: 1.7215 0.6410 sec/batch
Epoch 15/20  Iteration 2517/3560 Training loss: 1.7215 0.6229 sec/batch
Epoch 15/20  Iteration 2518/3560 Training loss: 1.7206 0.6471 sec/batch
Epoch 15/20  Iteration 2519/3560 Training loss: 1.7196 0.6359 sec/batch
Epoch 15/20  Iteration 2520/3560 Training loss: 1.7198 0.6423 sec/batch
Epoch 15/20  Iteration 2521/3560 Training loss: 1.7207 0.6402 sec/batch
Epoch 15/20  Iteration 2522/3560 Training loss: 1.7210 0.6280 sec/batch
Epoch 15/20  Iteration 2523/3560 Training loss: 1.7208 0.6342 sec/batch
Epoch 15/20  Iteration 2524/3560 Training loss: 1.7200 0.6596 sec/batch
Epoch 15/20  Iteration 2525/3560 Training loss: 1.7203 0.6897 sec/batch
Epoch 15/20  Iteration 2526/3560 Training loss: 1.7210 0.6559 sec/batch
Epoch 15/20  Iteration 2527/3560 Training loss: 1.7210 0.6478 sec/batch
Epoch 15/20  Iteration 2528/3560 Training loss: 1.7209 0.6292 sec/batch
Epoch 15/20  Iteration 2529/3560 Training loss: 1.7204 0.6412 sec/batch
Epoch 15/20  Iteration 2530/3560 Training loss: 1.7193 0.6366 sec/batch
Epoch 15/20  Iteration 2531/3560 Training loss: 1.7181 0.6312 sec/batch
Epoch 15/20  Iteration 2532/3560 Training loss: 1.7177 0.6432 sec/batch
Epoch 15/20  Iteration 2533/3560 Training loss: 1.7174 0.6402 sec/batch
Epoch 15/20  Iteration 2534/3560 Training loss: 1.7178 0.6476 sec/batch
Epoch 15/20  Iteration 2535/3560 Training loss: 1.7173 0.6403 sec/batch
Epoch 15/20  Iteration 2536/3560 Training loss: 1.7164 0.6408 sec/batch
Epoch 15/20  Iteration 2537/3560 Training loss: 1.7165 0.6307 sec/batch
Epoch 15/20  Iteration 2538/3560 Training loss: 1.7155 0.6456 sec/batch
Epoch 15/20  Iteration 2539/3560 Training loss: 1.7154 0.6334 sec/batch
Epoch 15/20  Iteration 2540/3560 Training loss: 1.7150 0.6365 sec/batch
Epoch 15/20  Iteration 2541/3560 Training loss: 1.7148 0.6447 sec/batch
Epoch 15/20  Iteration 2542/3560 Training loss: 1.7158 0.6508 sec/batch
Epoch 15/20  Iteration 2543/3560 Training loss: 1.7154 0.6422 sec/batch
Epoch 15/20  Iteration 2544/3560 Training loss: 1.7161 0.6568 sec/batch
Epoch 15/20  Iteration 2545/3560 Training loss: 1.7160 0.6293 sec/batch
Epoch 15/20  Iteration 2546/3560 Training loss: 1.7161 0.6453 sec/batch
Epoch 15/20  Iteration 2547/3560 Training loss: 1.7160 0.6425 sec/batch
Epoch 15/20  Iteration 2548/3560 Training loss: 1.7161 0.6349 sec/batch
Epoch 15/20  Iteration 2549/3560 Training loss: 1.7164 0.6347 sec/batch
Epoch 15/20  Iteration 2550/3560 Training loss: 1.7160 0.6750 sec/batch
Epoch 15/20  Iteration 2551/3560 Training loss: 1.7155 0.6339 sec/batch
Epoch 15/20  Iteration 2552/3560 Training loss: 1.7158 0.6294 sec/batch
Epoch 15/20  Iteration 2553/3560 Training loss: 1.7156 0.6266 sec/batch
Epoch 15/20  Iteration 2554/3560 Training loss: 1.7165 0.6571 sec/batch
Epoch 15/20  Iteration 2555/3560 Training loss: 1.7169 0.6342 sec/batch
Epoch 15/20  Iteration 2556/3560 Training loss: 1.7170 0.6338 sec/batch
Epoch 15/20  Iteration 2557/3560 Training loss: 1.7170 0.6527 sec/batch
Epoch 15/20  Iteration 2558/3560 Training loss: 1.7173 0.6461 sec/batch
Epoch 15/20  Iteration 2559/3560 Training loss: 1.7175 0.6309 sec/batch
Epoch 15/20  Iteration 2560/3560 Training loss: 1.7171 0.6437 sec/batch
Epoch 15/20  Iteration 2561/3560 Training loss: 1.7171 0.6401 sec/batch
Epoch 15/20  Iteration 2562/3560 Training loss: 1.7170 0.6383 sec/batch
Epoch 15/20  Iteration 2563/3560 Training loss: 1.7175 0.6418 sec/batch
Epoch 15/20  Iteration 2564/3560 Training loss: 1.7176 0.6351 sec/batch
Epoch 15/20  Iteration 2565/3560 Training loss: 1.7179 0.6497 sec/batch
Epoch 15/20  Iteration 2566/3560 Training loss: 1.7177 0.6294 sec/batch
Epoch 15/20  Iteration 2567/3560 Training loss: 1.7177 0.6251 sec/batch
Epoch 15/20  Iteration 2568/3560 Training loss: 1.7179 0.6402 sec/batch
Epoch 15/20  Iteration 2569/3560 Training loss: 1.7177 0.6394 sec/batch
Epoch 15/20  Iteration 2570/3560 Training loss: 1.7176 0.6207 sec/batch
Epoch 15/20  Iteration 2571/3560 Training loss: 1.7171 0.6396 sec/batch
Epoch 15/20  Iteration 2572/3560 Training loss: 1.7170 0.6571 sec/batch
Epoch 15/20  Iteration 2573/3560 Training loss: 1.7166 0.6654 sec/batch
Epoch 15/20  Iteration 2574/3560 Training loss: 1.7166 0.6153 sec/batch
Epoch 15/20  Iteration 2575/3560 Training loss: 1.7160 0.6321 sec/batch
Epoch 15/20  Iteration 2576/3560 Training loss: 1.7158 0.6327 sec/batch
Epoch 15/20  Iteration 2577/3560 Training loss: 1.7153 0.6421 sec/batch
Epoch 15/20  Iteration 2578/3560 Training loss: 1.7150 0.6359 sec/batch
Epoch 15/20  Iteration 2579/3560 Training loss: 1.7149 0.6358 sec/batch
Epoch 15/20  Iteration 2580/3560 Training loss: 1.7145 0.6348 sec/batch
Epoch 15/20  Iteration 2581/3560 Training loss: 1.7140 0.6513 sec/batch
Epoch 15/20  Iteration 2582/3560 Training loss: 1.7143 0.6483 sec/batch
Epoch 15/20  Iteration 2583/3560 Training loss: 1.7140 0.6262 sec/batch
Epoch 15/20  Iteration 2584/3560 Training loss: 1.7138 0.6305 sec/batch
Epoch 15/20  Iteration 2585/3560 Training loss: 1.7135 0.6338 sec/batch
Epoch 15/20  Iteration 2586/3560 Training loss: 1.7133 0.6351 sec/batch
Epoch 15/20  Iteration 2587/3560 Training loss: 1.7130 0.6217 sec/batch
Epoch 15/20  Iteration 2588/3560 Training loss: 1.7131 0.6380 sec/batch
Epoch 15/20  Iteration 2589/3560 Training loss: 1.7130 0.6429 sec/batch
Epoch 15/20  Iteration 2590/3560 Training loss: 1.7127 0.6327 sec/batch
Epoch 15/20  Iteration 2591/3560 Training loss: 1.7124 0.6338 sec/batch
Epoch 15/20  Iteration 2592/3560 Training loss: 1.7120 0.6419 sec/batch
Epoch 15/20  Iteration 2593/3560 Training loss: 1.7120 0.6329 sec/batch
Epoch 15/20  Iteration 2594/3560 Training loss: 1.7118 0.6331 sec/batch
Epoch 15/20  Iteration 2595/3560 Training loss: 1.7117 0.6349 sec/batch
Epoch 15/20  Iteration 2596/3560 Training loss: 1.7115 0.6491 sec/batch
Epoch 15/20  Iteration 2597/3560 Training loss: 1.7113 0.6548 sec/batch
Epoch 15/20  Iteration 2598/3560 Training loss: 1.7113 0.6334 sec/batch
Epoch 15/20  Iteration 2599/3560 Training loss: 1.7112 0.6316 sec/batch
Epoch 15/20  Iteration 2600/3560 Training loss: 1.7111 0.6434 sec/batch
Epoch 15/20  Iteration 2601/3560 Training loss: 1.7112 0.6349 sec/batch
Epoch 15/20  Iteration 2602/3560 Training loss: 1.7112 0.6252 sec/batch
Epoch 15/20  Iteration 2603/3560 Training loss: 1.7112 0.6290 sec/batch
Epoch 15/20  Iteration 2604/3560 Training loss: 1.7111 0.6393 sec/batch
Epoch 15/20  Iteration 2605/3560 Training loss: 1.7110 0.6521 sec/batch
Epoch 15/20  Iteration 2606/3560 Training loss: 1.7109 0.6366 sec/batch
Epoch 15/20  Iteration 2607/3560 Training loss: 1.7106 0.6360 sec/batch
Epoch 15/20  Iteration 2608/3560 Training loss: 1.7103 0.6259 sec/batch
Epoch 15/20  Iteration 2609/3560 Training loss: 1.7103 0.6280 sec/batch
Epoch 15/20  Iteration 2610/3560 Training loss: 1.7103 0.6328 sec/batch
Epoch 15/20  Iteration 2611/3560 Training loss: 1.7102 0.6463 sec/batch
Epoch 15/20  Iteration 2612/3560 Training loss: 1.7102 0.6398 sec/batch
Epoch 15/20  Iteration 2613/3560 Training loss: 1.7102 0.6539 sec/batch
Epoch 15/20  Iteration 2614/3560 Training loss: 1.7099 0.6423 sec/batch
Epoch 15/20  Iteration 2615/3560 Training loss: 1.7096 0.6257 sec/batch
Epoch 15/20  Iteration 2616/3560 Training loss: 1.7097 0.6357 sec/batch
Epoch 15/20  Iteration 2617/3560 Training loss: 1.7098 0.6301 sec/batch
Epoch 15/20  Iteration 2618/3560 Training loss: 1.7092 0.6423 sec/batch
Epoch 15/20  Iteration 2619/3560 Training loss: 1.7094 0.6425 sec/batch
Epoch 15/20  Iteration 2620/3560 Training loss: 1.7094 0.6261 sec/batch
Epoch 15/20  Iteration 2621/3560 Training loss: 1.7094 0.6541 sec/batch
Epoch 15/20  Iteration 2622/3560 Training loss: 1.7092 0.6347 sec/batch
Epoch 15/20  Iteration 2623/3560 Training loss: 1.7089 0.6255 sec/batch
Epoch 15/20  Iteration 2624/3560 Training loss: 1.7087 0.6409 sec/batch
Epoch 15/20  Iteration 2625/3560 Training loss: 1.7088 0.6200 sec/batch
Epoch 15/20  Iteration 2626/3560 Training loss: 1.7088 0.6398 sec/batch
Epoch 15/20  Iteration 2627/3560 Training loss: 1.7088 0.6312 sec/batch
Epoch 15/20  Iteration 2628/3560 Training loss: 1.7089 0.6275 sec/batch
Epoch 15/20  Iteration 2629/3560 Training loss: 1.7090 0.6620 sec/batch
Epoch 15/20  Iteration 2630/3560 Training loss: 1.7090 0.6327 sec/batch
Epoch 15/20  Iteration 2631/3560 Training loss: 1.7091 0.6296 sec/batch
Epoch 15/20  Iteration 2632/3560 Training loss: 1.7090 0.6326 sec/batch
Epoch 15/20  Iteration 2633/3560 Training loss: 1.7094 0.6299 sec/batch
Epoch 15/20  Iteration 2634/3560 Training loss: 1.7094 0.6269 sec/batch
Epoch 15/20  Iteration 2635/3560 Training loss: 1.7093 0.6315 sec/batch
Epoch 15/20  Iteration 2636/3560 Training loss: 1.7093 0.6334 sec/batch
Epoch 15/20  Iteration 2637/3560 Training loss: 1.7093 0.6502 sec/batch
Epoch 15/20  Iteration 2638/3560 Training loss: 1.7093 0.6443 sec/batch
Epoch 15/20  Iteration 2639/3560 Training loss: 1.7093 0.6238 sec/batch
Epoch 15/20  Iteration 2640/3560 Training loss: 1.7095 0.6428 sec/batch
Epoch 15/20  Iteration 2641/3560 Training loss: 1.7096 0.6346 sec/batch
Epoch 15/20  Iteration 2642/3560 Training loss: 1.7095 0.6356 sec/batch
Epoch 15/20  Iteration 2643/3560 Training loss: 1.7092 0.6440 sec/batch
Epoch 15/20  Iteration 2644/3560 Training loss: 1.7093 0.6358 sec/batch
Epoch 15/20  Iteration 2645/3560 Training loss: 1.7093 0.6328 sec/batch
Epoch 15/20  Iteration 2646/3560 Training loss: 1.7093 0.6487 sec/batch
Epoch 15/20  Iteration 2647/3560 Training loss: 1.7093 0.6392 sec/batch
Epoch 15/20  Iteration 2648/3560 Training loss: 1.7092 0.6389 sec/batch
Epoch 15/20  Iteration 2649/3560 Training loss: 1.7093 0.6355 sec/batch
Epoch 15/20  Iteration 2650/3560 Training loss: 1.7094 0.6371 sec/batch
Epoch 15/20  Iteration 2651/3560 Training loss: 1.7092 0.6444 sec/batch
Epoch 15/20  Iteration 2652/3560 Training loss: 1.7094 0.6590 sec/batch
Epoch 15/20  Iteration 2653/3560 Training loss: 1.7096 0.6450 sec/batch
Epoch 15/20  Iteration 2654/3560 Training loss: 1.7095 0.6311 sec/batch
Epoch 15/20  Iteration 2655/3560 Training loss: 1.7095 0.6230 sec/batch
Epoch 15/20  Iteration 2656/3560 Training loss: 1.7095 0.6354 sec/batch
Epoch 15/20  Iteration 2657/3560 Training loss: 1.7094 0.6377 sec/batch
Epoch 15/20  Iteration 2658/3560 Training loss: 1.7094 0.6278 sec/batch
Epoch 15/20  Iteration 2659/3560 Training loss: 1.7095 0.6327 sec/batch
Epoch 15/20  Iteration 2660/3560 Training loss: 1.7098 0.6498 sec/batch
Epoch 15/20  Iteration 2661/3560 Training loss: 1.7098 0.6215 sec/batch
Epoch 15/20  Iteration 2662/3560 Training loss: 1.7098 0.6274 sec/batch
Epoch 15/20  Iteration 2663/3560 Training loss: 1.7096 0.6503 sec/batch
Epoch 15/20  Iteration 2664/3560 Training loss: 1.7095 0.6334 sec/batch
Epoch 15/20  Iteration 2665/3560 Training loss: 1.7096 0.6372 sec/batch
Epoch 15/20  Iteration 2666/3560 Training loss: 1.7096 0.6368 sec/batch
Epoch 15/20  Iteration 2667/3560 Training loss: 1.7096 0.6285 sec/batch
Epoch 15/20  Iteration 2668/3560 Training loss: 1.7096 0.6348 sec/batch
Epoch 15/20  Iteration 2669/3560 Training loss: 1.7095 0.6317 sec/batch
Epoch 15/20  Iteration 2670/3560 Training loss: 1.7094 0.6294 sec/batch
Epoch 16/20  Iteration 2671/3560 Training loss: 1.7616 0.6308 sec/batch
Epoch 16/20  Iteration 2672/3560 Training loss: 1.7279 0.6470 sec/batch
Epoch 16/20  Iteration 2673/3560 Training loss: 1.7208 0.6499 sec/batch
Epoch 16/20  Iteration 2674/3560 Training loss: 1.7183 0.6376 sec/batch
Epoch 16/20  Iteration 2675/3560 Training loss: 1.7124 0.6378 sec/batch
Epoch 16/20  Iteration 2676/3560 Training loss: 1.7048 0.6551 sec/batch
Epoch 16/20  Iteration 2677/3560 Training loss: 1.7049 0.6125 sec/batch
Epoch 16/20  Iteration 2678/3560 Training loss: 1.7030 0.6448 sec/batch
Epoch 16/20  Iteration 2679/3560 Training loss: 1.7050 0.6544 sec/batch
Epoch 16/20  Iteration 2680/3560 Training loss: 1.7040 0.6446 sec/batch
Epoch 16/20  Iteration 2681/3560 Training loss: 1.7013 0.6437 sec/batch
Epoch 16/20  Iteration 2682/3560 Training loss: 1.7003 0.6353 sec/batch
Epoch 16/20  Iteration 2683/3560 Training loss: 1.6999 0.6337 sec/batch
Epoch 16/20  Iteration 2684/3560 Training loss: 1.7028 0.6680 sec/batch
Epoch 16/20  Iteration 2685/3560 Training loss: 1.7014 0.6252 sec/batch
Epoch 16/20  Iteration 2686/3560 Training loss: 1.6996 0.6366 sec/batch
Epoch 16/20  Iteration 2687/3560 Training loss: 1.6999 0.6295 sec/batch
Epoch 16/20  Iteration 2688/3560 Training loss: 1.7022 0.6463 sec/batch
Epoch 16/20  Iteration 2689/3560 Training loss: 1.7017 0.6294 sec/batch
Epoch 16/20  Iteration 2690/3560 Training loss: 1.7022 0.6526 sec/batch
Epoch 16/20  Iteration 2691/3560 Training loss: 1.7019 0.6532 sec/batch
Epoch 16/20  Iteration 2692/3560 Training loss: 1.7022 0.6497 sec/batch
Epoch 16/20  Iteration 2693/3560 Training loss: 1.7019 0.6258 sec/batch
Epoch 16/20  Iteration 2694/3560 Training loss: 1.7013 0.6313 sec/batch
Epoch 16/20  Iteration 2695/3560 Training loss: 1.7013 0.6271 sec/batch
Epoch 16/20  Iteration 2696/3560 Training loss: 1.7000 0.6323 sec/batch
Epoch 16/20  Iteration 2697/3560 Training loss: 1.6992 0.6302 sec/batch
Epoch 16/20  Iteration 2698/3560 Training loss: 1.7002 0.6348 sec/batch
Epoch 16/20  Iteration 2699/3560 Training loss: 1.7009 0.6355 sec/batch
Epoch 16/20  Iteration 2700/3560 Training loss: 1.7013 0.6624 sec/batch
Epoch 16/20  Iteration 2701/3560 Training loss: 1.7015 0.6475 sec/batch
Epoch 16/20  Iteration 2702/3560 Training loss: 1.7007 0.6263 sec/batch
Epoch 16/20  Iteration 2703/3560 Training loss: 1.7010 0.6300 sec/batch
Epoch 16/20  Iteration 2704/3560 Training loss: 1.7014 0.6461 sec/batch
Epoch 16/20  Iteration 2705/3560 Training loss: 1.7011 0.6323 sec/batch
Epoch 16/20  Iteration 2706/3560 Training loss: 1.7009 0.6377 sec/batch
Epoch 16/20  Iteration 2707/3560 Training loss: 1.7002 0.6326 sec/batch
Epoch 16/20  Iteration 2708/3560 Training loss: 1.6990 0.6554 sec/batch
Epoch 16/20  Iteration 2709/3560 Training loss: 1.6979 0.6453 sec/batch
Epoch 16/20  Iteration 2710/3560 Training loss: 1.6973 0.6302 sec/batch
Epoch 16/20  Iteration 2711/3560 Training loss: 1.6969 0.7067 sec/batch
Epoch 16/20  Iteration 2712/3560 Training loss: 1.6972 0.6332 sec/batch
Epoch 16/20  Iteration 2713/3560 Training loss: 1.6967 0.6565 sec/batch
Epoch 16/20  Iteration 2714/3560 Training loss: 1.6960 0.6592 sec/batch
Epoch 16/20  Iteration 2715/3560 Training loss: 1.6963 0.6546 sec/batch
Epoch 16/20  Iteration 2716/3560 Training loss: 1.6952 0.6355 sec/batch
Epoch 16/20  Iteration 2717/3560 Training loss: 1.6951 0.6441 sec/batch
Epoch 16/20  Iteration 2718/3560 Training loss: 1.6947 0.6372 sec/batch
Epoch 16/20  Iteration 2719/3560 Training loss: 1.6947 0.6399 sec/batch
Epoch 16/20  Iteration 2720/3560 Training loss: 1.6956 0.6365 sec/batch
Epoch 16/20  Iteration 2721/3560 Training loss: 1.6952 0.6457 sec/batch
Epoch 16/20  Iteration 2722/3560 Training loss: 1.6960 0.6419 sec/batch
Epoch 16/20  Iteration 2723/3560 Training loss: 1.6957 0.6546 sec/batch
Epoch 16/20  Iteration 2724/3560 Training loss: 1.6959 0.6343 sec/batch
Epoch 16/20  Iteration 2725/3560 Training loss: 1.6957 0.6368 sec/batch
Epoch 16/20  Iteration 2726/3560 Training loss: 1.6959 0.6367 sec/batch
Epoch 16/20  Iteration 2727/3560 Training loss: 1.6965 0.6397 sec/batch
Epoch 16/20  Iteration 2728/3560 Training loss: 1.6962 0.6462 sec/batch
Epoch 16/20  Iteration 2729/3560 Training loss: 1.6958 0.6371 sec/batch
Epoch 16/20  Iteration 2730/3560 Training loss: 1.6962 0.6359 sec/batch
Epoch 16/20  Iteration 2731/3560 Training loss: 1.6961 0.6489 sec/batch
Epoch 16/20  Iteration 2732/3560 Training loss: 1.6970 0.6433 sec/batch
Epoch 16/20  Iteration 2733/3560 Training loss: 1.6974 0.6241 sec/batch
Epoch 16/20  Iteration 2734/3560 Training loss: 1.6978 0.6458 sec/batch
Epoch 16/20  Iteration 2735/3560 Training loss: 1.6977 0.6334 sec/batch
Epoch 16/20  Iteration 2736/3560 Training loss: 1.6979 0.6621 sec/batch
Epoch 16/20  Iteration 2737/3560 Training loss: 1.6982 0.6262 sec/batch
Epoch 16/20  Iteration 2738/3560 Training loss: 1.6978 0.6448 sec/batch
Epoch 16/20  Iteration 2739/3560 Training loss: 1.6979 0.6542 sec/batch
Epoch 16/20  Iteration 2740/3560 Training loss: 1.6977 0.6362 sec/batch
Epoch 16/20  Iteration 2741/3560 Training loss: 1.6982 0.6389 sec/batch
Epoch 16/20  Iteration 2742/3560 Training loss: 1.6984 0.6349 sec/batch
Epoch 16/20  Iteration 2743/3560 Training loss: 1.6988 0.6645 sec/batch
Epoch 16/20  Iteration 2744/3560 Training loss: 1.6986 0.8337 sec/batch
Epoch 16/20  Iteration 2745/3560 Training loss: 1.6985 0.6499 sec/batch
Epoch 16/20  Iteration 2746/3560 Training loss: 1.6987 0.6558 sec/batch
Epoch 16/20  Iteration 2747/3560 Training loss: 1.6988 0.6294 sec/batch
Epoch 16/20  Iteration 2748/3560 Training loss: 1.6988 0.6380 sec/batch
Epoch 16/20  Iteration 2749/3560 Training loss: 1.6985 0.6338 sec/batch
Epoch 16/20  Iteration 2750/3560 Training loss: 1.6984 0.6326 sec/batch
Epoch 16/20  Iteration 2751/3560 Training loss: 1.6979 0.6567 sec/batch
Epoch 16/20  Iteration 2752/3560 Training loss: 1.6980 0.6335 sec/batch
Epoch 16/20  Iteration 2753/3560 Training loss: 1.6975 0.6454 sec/batch
Epoch 16/20  Iteration 2754/3560 Training loss: 1.6976 0.6620 sec/batch
Epoch 16/20  Iteration 2755/3560 Training loss: 1.6972 0.6115 sec/batch
Epoch 16/20  Iteration 2756/3560 Training loss: 1.6970 0.6408 sec/batch
Epoch 16/20  Iteration 2757/3560 Training loss: 1.6968 0.6312 sec/batch
Epoch 16/20  Iteration 2758/3560 Training loss: 1.6964 0.6802 sec/batch
Epoch 16/20  Iteration 2759/3560 Training loss: 1.6960 0.6654 sec/batch
Epoch 16/20  Iteration 2760/3560 Training loss: 1.6961 0.6568 sec/batch
Epoch 16/20  Iteration 2761/3560 Training loss: 1.6958 0.6424 sec/batch
Epoch 16/20  Iteration 2762/3560 Training loss: 1.6957 0.6500 sec/batch
Epoch 16/20  Iteration 2763/3560 Training loss: 1.6953 0.6375 sec/batch
Epoch 16/20  Iteration 2764/3560 Training loss: 1.6951 0.6357 sec/batch
Epoch 16/20  Iteration 2765/3560 Training loss: 1.6948 0.6419 sec/batch
Epoch 16/20  Iteration 2766/3560 Training loss: 1.6949 0.6398 sec/batch
Epoch 16/20  Iteration 2767/3560 Training loss: 1.6948 0.6385 sec/batch
Epoch 16/20  Iteration 2768/3560 Training loss: 1.6943 0.6419 sec/batch
Epoch 16/20  Iteration 2769/3560 Training loss: 1.6941 0.6319 sec/batch
Epoch 16/20  Iteration 2770/3560 Training loss: 1.6937 0.6439 sec/batch
Epoch 16/20  Iteration 2771/3560 Training loss: 1.6938 0.6470 sec/batch
Epoch 16/20  Iteration 2772/3560 Training loss: 1.6937 0.6352 sec/batch
Epoch 16/20  Iteration 2773/3560 Training loss: 1.6934 0.6362 sec/batch
Epoch 16/20  Iteration 2774/3560 Training loss: 1.6932 0.6480 sec/batch
Epoch 16/20  Iteration 2775/3560 Training loss: 1.6929 0.6397 sec/batch
Epoch 16/20  Iteration 2776/3560 Training loss: 1.6928 0.6316 sec/batch
Epoch 16/20  Iteration 2777/3560 Training loss: 1.6928 0.6394 sec/batch
Epoch 16/20  Iteration 2778/3560 Training loss: 1.6928 0.6590 sec/batch
Epoch 16/20  Iteration 2779/3560 Training loss: 1.6928 0.6224 sec/batch
Epoch 16/20  Iteration 2780/3560 Training loss: 1.6928 0.6484 sec/batch
Epoch 16/20  Iteration 2781/3560 Training loss: 1.6927 0.6459 sec/batch
Epoch 16/20  Iteration 2782/3560 Training loss: 1.6925 0.6533 sec/batch
Epoch 16/20  Iteration 2783/3560 Training loss: 1.6924 0.7611 sec/batch
Epoch 16/20  Iteration 2784/3560 Training loss: 1.6923 0.6460 sec/batch
Epoch 16/20  Iteration 2785/3560 Training loss: 1.6920 0.6437 sec/batch
Epoch 16/20  Iteration 2786/3560 Training loss: 1.6916 0.6261 sec/batch
Epoch 16/20  Iteration 2787/3560 Training loss: 1.6916 0.7450 sec/batch
Epoch 16/20  Iteration 2788/3560 Training loss: 1.6916 0.7128 sec/batch
Epoch 16/20  Iteration 2789/3560 Training loss: 1.6915 0.6799 sec/batch
Epoch 16/20  Iteration 2790/3560 Training loss: 1.6916 0.6637 sec/batch
Epoch 16/20  Iteration 2791/3560 Training loss: 1.6916 0.6624 sec/batch
Epoch 16/20  Iteration 2792/3560 Training loss: 1.6913 0.6649 sec/batch
Epoch 16/20  Iteration 2793/3560 Training loss: 1.6910 0.6688 sec/batch
Epoch 16/20  Iteration 2794/3560 Training loss: 1.6911 0.6648 sec/batch
Epoch 16/20  Iteration 2795/3560 Training loss: 1.6911 0.6605 sec/batch
Epoch 16/20  Iteration 2796/3560 Training loss: 1.6907 0.6993 sec/batch
Epoch 16/20  Iteration 2797/3560 Training loss: 1.6908 0.6475 sec/batch
Epoch 16/20  Iteration 2798/3560 Training loss: 1.6908 0.6416 sec/batch
Epoch 16/20  Iteration 2799/3560 Training loss: 1.6907 0.6651 sec/batch
Epoch 16/20  Iteration 2800/3560 Training loss: 1.6906 0.6543 sec/batch
Epoch 16/20  Iteration 2801/3560 Training loss: 1.6904 0.8169 sec/batch
Epoch 16/20  Iteration 2802/3560 Training loss: 1.6902 0.6484 sec/batch
Epoch 16/20  Iteration 2803/3560 Training loss: 1.6903 0.6750 sec/batch
Epoch 16/20  Iteration 2804/3560 Training loss: 1.6902 0.6451 sec/batch
Epoch 16/20  Iteration 2805/3560 Training loss: 1.6902 0.6935 sec/batch
Epoch 16/20  Iteration 2806/3560 Training loss: 1.6903 0.7018 sec/batch
Epoch 16/20  Iteration 2807/3560 Training loss: 1.6904 0.6415 sec/batch
Epoch 16/20  Iteration 2808/3560 Training loss: 1.6904 0.6741 sec/batch
Epoch 16/20  Iteration 2809/3560 Training loss: 1.6906 0.6443 sec/batch
Epoch 16/20  Iteration 2810/3560 Training loss: 1.6905 0.6277 sec/batch
Epoch 16/20  Iteration 2811/3560 Training loss: 1.6909 0.6385 sec/batch
Epoch 16/20  Iteration 2812/3560 Training loss: 1.6908 0.6346 sec/batch
Epoch 16/20  Iteration 2813/3560 Training loss: 1.6908 0.6489 sec/batch
Epoch 16/20  Iteration 2814/3560 Training loss: 1.6909 0.6122 sec/batch
Epoch 16/20  Iteration 2815/3560 Training loss: 1.6908 0.6445 sec/batch
Epoch 16/20  Iteration 2816/3560 Training loss: 1.6909 0.7241 sec/batch
Epoch 16/20  Iteration 2817/3560 Training loss: 1.6909 0.6910 sec/batch
Epoch 16/20  Iteration 2818/3560 Training loss: 1.6912 0.6755 sec/batch
Epoch 16/20  Iteration 2819/3560 Training loss: 1.6913 0.7358 sec/batch
Epoch 16/20  Iteration 2820/3560 Training loss: 1.6912 0.6902 sec/batch
Epoch 16/20  Iteration 2821/3560 Training loss: 1.6910 0.7177 sec/batch
Epoch 16/20  Iteration 2822/3560 Training loss: 1.6911 0.7128 sec/batch
Epoch 16/20  Iteration 2823/3560 Training loss: 1.6912 0.7392 sec/batch
Epoch 16/20  Iteration 2824/3560 Training loss: 1.6912 0.7124 sec/batch
Epoch 16/20  Iteration 2825/3560 Training loss: 1.6912 0.7060 sec/batch
Epoch 16/20  Iteration 2826/3560 Training loss: 1.6913 0.7251 sec/batch
Epoch 16/20  Iteration 2827/3560 Training loss: 1.6913 0.7086 sec/batch
Epoch 16/20  Iteration 2828/3560 Training loss: 1.6913 0.6586 sec/batch
Epoch 16/20  Iteration 2829/3560 Training loss: 1.6911 0.6690 sec/batch
Epoch 16/20  Iteration 2830/3560 Training loss: 1.6912 0.6987 sec/batch
Epoch 16/20  Iteration 2831/3560 Training loss: 1.6915 0.7248 sec/batch
Epoch 16/20  Iteration 2832/3560 Training loss: 1.6915 0.7221 sec/batch
Epoch 16/20  Iteration 2833/3560 Training loss: 1.6915 0.8093 sec/batch
Epoch 16/20  Iteration 2834/3560 Training loss: 1.6915 0.8549 sec/batch
Epoch 16/20  Iteration 2835/3560 Training loss: 1.6914 0.7810 sec/batch
Epoch 16/20  Iteration 2836/3560 Training loss: 1.6914 0.8222 sec/batch
Epoch 16/20  Iteration 2837/3560 Training loss: 1.6914 0.7532 sec/batch
Epoch 16/20  Iteration 2838/3560 Training loss: 1.6919 0.7444 sec/batch
Epoch 16/20  Iteration 2839/3560 Training loss: 1.6919 0.6877 sec/batch
Epoch 16/20  Iteration 2840/3560 Training loss: 1.6919 0.7568 sec/batch
Epoch 16/20  Iteration 2841/3560 Training loss: 1.6917 0.6414 sec/batch
Epoch 16/20  Iteration 2842/3560 Training loss: 1.6916 0.7351 sec/batch
Epoch 16/20  Iteration 2843/3560 Training loss: 1.6917 0.6507 sec/batch
Epoch 16/20  Iteration 2844/3560 Training loss: 1.6917 0.7335 sec/batch
Epoch 16/20  Iteration 2845/3560 Training loss: 1.6918 0.7165 sec/batch
Epoch 16/20  Iteration 2846/3560 Training loss: 1.6918 0.6715 sec/batch
Epoch 16/20  Iteration 2847/3560 Training loss: 1.6916 0.6526 sec/batch
Epoch 16/20  Iteration 2848/3560 Training loss: 1.6917 0.6947 sec/batch
Epoch 17/20  Iteration 2849/3560 Training loss: 1.7599 0.7052 sec/batch
Epoch 17/20  Iteration 2850/3560 Training loss: 1.7200 0.6605 sec/batch
Epoch 17/20  Iteration 2851/3560 Training loss: 1.7114 0.6576 sec/batch
Epoch 17/20  Iteration 2852/3560 Training loss: 1.7066 0.6471 sec/batch
Epoch 17/20  Iteration 2853/3560 Training loss: 1.7002 0.6274 sec/batch
Epoch 17/20  Iteration 2854/3560 Training loss: 1.6906 0.6418 sec/batch
Epoch 17/20  Iteration 2855/3560 Training loss: 1.6920 0.6320 sec/batch
Epoch 17/20  Iteration 2856/3560 Training loss: 1.6905 0.6309 sec/batch
Epoch 17/20  Iteration 2857/3560 Training loss: 1.6925 0.6348 sec/batch
Epoch 17/20  Iteration 2858/3560 Training loss: 1.6912 0.6370 sec/batch
Epoch 17/20  Iteration 2859/3560 Training loss: 1.6886 0.6504 sec/batch
Epoch 17/20  Iteration 2860/3560 Training loss: 1.6876 0.6409 sec/batch
Epoch 17/20  Iteration 2861/3560 Training loss: 1.6865 0.6405 sec/batch
Epoch 17/20  Iteration 2862/3560 Training loss: 1.6889 0.6373 sec/batch
Epoch 17/20  Iteration 2863/3560 Training loss: 1.6883 0.6469 sec/batch
Epoch 17/20  Iteration 2864/3560 Training loss: 1.6860 0.6303 sec/batch
Epoch 17/20  Iteration 2865/3560 Training loss: 1.6865 0.6339 sec/batch
Epoch 17/20  Iteration 2866/3560 Training loss: 1.6887 0.6492 sec/batch
Epoch 17/20  Iteration 2867/3560 Training loss: 1.6888 0.6373 sec/batch
Epoch 17/20  Iteration 2868/3560 Training loss: 1.6891 0.6300 sec/batch
Epoch 17/20  Iteration 2869/3560 Training loss: 1.6891 0.6496 sec/batch
Epoch 17/20  Iteration 2870/3560 Training loss: 1.6893 0.6373 sec/batch
Epoch 17/20  Iteration 2871/3560 Training loss: 1.6882 0.6428 sec/batch
Epoch 17/20  Iteration 2872/3560 Training loss: 1.6878 0.6460 sec/batch
Epoch 17/20  Iteration 2873/3560 Training loss: 1.6874 0.6377 sec/batch
Epoch 17/20  Iteration 2874/3560 Training loss: 1.6864 0.6417 sec/batch
Epoch 17/20  Iteration 2875/3560 Training loss: 1.6859 0.6291 sec/batch
Epoch 17/20  Iteration 2876/3560 Training loss: 1.6863 0.6791 sec/batch
Epoch 17/20  Iteration 2877/3560 Training loss: 1.6871 0.6651 sec/batch
Epoch 17/20  Iteration 2878/3560 Training loss: 1.6876 0.6566 sec/batch
Epoch 17/20  Iteration 2879/3560 Training loss: 1.6876 0.6631 sec/batch
Epoch 17/20  Iteration 2880/3560 Training loss: 1.6865 0.6428 sec/batch
Epoch 17/20  Iteration 2881/3560 Training loss: 1.6871 0.6396 sec/batch
Epoch 17/20  Iteration 2882/3560 Training loss: 1.6878 0.6631 sec/batch
Epoch 17/20  Iteration 2883/3560 Training loss: 1.6874 0.6400 sec/batch
Epoch 17/20  Iteration 2884/3560 Training loss: 1.6873 0.6438 sec/batch
Epoch 17/20  Iteration 2885/3560 Training loss: 1.6866 0.6325 sec/batch
Epoch 17/20  Iteration 2886/3560 Training loss: 1.6854 0.6452 sec/batch
Epoch 17/20  Iteration 2887/3560 Training loss: 1.6844 0.6387 sec/batch
Epoch 17/20  Iteration 2888/3560 Training loss: 1.6839 0.6312 sec/batch
Epoch 17/20  Iteration 2889/3560 Training loss: 1.6836 0.6335 sec/batch
Epoch 17/20  Iteration 2890/3560 Training loss: 1.6840 0.6947 sec/batch
Epoch 17/20  Iteration 2891/3560 Training loss: 1.6836 0.6715 sec/batch
Epoch 17/20  Iteration 2892/3560 Training loss: 1.6829 0.6566 sec/batch
Epoch 17/20  Iteration 2893/3560 Training loss: 1.6829 0.6396 sec/batch
Epoch 17/20  Iteration 2894/3560 Training loss: 1.6818 0.6533 sec/batch
Epoch 17/20  Iteration 2895/3560 Training loss: 1.6819 0.6398 sec/batch
Epoch 17/20  Iteration 2896/3560 Training loss: 1.6816 0.6491 sec/batch
Epoch 17/20  Iteration 2897/3560 Training loss: 1.6815 0.6317 sec/batch
Epoch 17/20  Iteration 2898/3560 Training loss: 1.6822 0.6542 sec/batch
Epoch 17/20  Iteration 2899/3560 Training loss: 1.6815 0.6355 sec/batch
Epoch 17/20  Iteration 2900/3560 Training loss: 1.6823 0.6369 sec/batch
Epoch 17/20  Iteration 2901/3560 Training loss: 1.6824 0.6412 sec/batch
Epoch 17/20  Iteration 2902/3560 Training loss: 1.6827 0.6390 sec/batch
Epoch 17/20  Iteration 2903/3560 Training loss: 1.6823 0.6399 sec/batch
Epoch 17/20  Iteration 2904/3560 Training loss: 1.6827 0.6370 sec/batch
Epoch 17/20  Iteration 2905/3560 Training loss: 1.6830 0.6450 sec/batch
Epoch 17/20  Iteration 2906/3560 Training loss: 1.6827 0.6308 sec/batch
Epoch 17/20  Iteration 2907/3560 Training loss: 1.6823 0.6329 sec/batch
Epoch 17/20  Iteration 2908/3560 Training loss: 1.6828 0.6275 sec/batch
Epoch 17/20  Iteration 2909/3560 Training loss: 1.6827 0.7098 sec/batch
Epoch 17/20  Iteration 2910/3560 Training loss: 1.6835 0.6518 sec/batch
Epoch 17/20  Iteration 2911/3560 Training loss: 1.6839 0.6649 sec/batch
Epoch 17/20  Iteration 2912/3560 Training loss: 1.6842 0.6333 sec/batch
Epoch 17/20  Iteration 2913/3560 Training loss: 1.6841 0.6511 sec/batch
Epoch 17/20  Iteration 2914/3560 Training loss: 1.6845 0.6341 sec/batch
Epoch 17/20  Iteration 2915/3560 Training loss: 1.6846 0.6245 sec/batch
Epoch 17/20  Iteration 2916/3560 Training loss: 1.6843 0.6291 sec/batch
Epoch 17/20  Iteration 2917/3560 Training loss: 1.6842 0.6355 sec/batch
Epoch 17/20  Iteration 2918/3560 Training loss: 1.6841 0.6296 sec/batch
Epoch 17/20  Iteration 2919/3560 Training loss: 1.6846 0.6462 sec/batch
Epoch 17/20  Iteration 2920/3560 Training loss: 1.6847 0.6342 sec/batch
Epoch 17/20  Iteration 2921/3560 Training loss: 1.6849 0.6688 sec/batch
Epoch 17/20  Iteration 2922/3560 Training loss: 1.6847 0.6298 sec/batch
Epoch 17/20  Iteration 2923/3560 Training loss: 1.6845 0.6382 sec/batch
Epoch 17/20  Iteration 2924/3560 Training loss: 1.6846 0.6508 sec/batch
Epoch 17/20  Iteration 2925/3560 Training loss: 1.6846 0.6437 sec/batch
Epoch 17/20  Iteration 2926/3560 Training loss: 1.6847 0.6413 sec/batch
Epoch 17/20  Iteration 2927/3560 Training loss: 1.6841 0.6466 sec/batch
Epoch 17/20  Iteration 2928/3560 Training loss: 1.6838 0.6407 sec/batch
Epoch 17/20  Iteration 2929/3560 Training loss: 1.6833 0.6548 sec/batch
Epoch 17/20  Iteration 2930/3560 Training loss: 1.6833 0.6343 sec/batch
Epoch 17/20  Iteration 2931/3560 Training loss: 1.6828 0.6279 sec/batch
Epoch 17/20  Iteration 2932/3560 Training loss: 1.6827 0.5979 sec/batch
Epoch 17/20  Iteration 2933/3560 Training loss: 1.6822 0.6363 sec/batch
Epoch 17/20  Iteration 2934/3560 Training loss: 1.6820 0.6312 sec/batch
Epoch 17/20  Iteration 2935/3560 Training loss: 1.6818 0.6219 sec/batch
Epoch 17/20  Iteration 2936/3560 Training loss: 1.6815 0.6442 sec/batch
Epoch 17/20  Iteration 2937/3560 Training loss: 1.6810 0.6688 sec/batch
Epoch 17/20  Iteration 2938/3560 Training loss: 1.6811 0.6656 sec/batch
Epoch 17/20  Iteration 2939/3560 Training loss: 1.6809 0.6843 sec/batch
Epoch 17/20  Iteration 2940/3560 Training loss: 1.6806 0.6829 sec/batch
Epoch 17/20  Iteration 2941/3560 Training loss: 1.6802 0.6516 sec/batch
Epoch 17/20  Iteration 2942/3560 Training loss: 1.6799 0.6316 sec/batch
Epoch 17/20  Iteration 2943/3560 Training loss: 1.6796 0.6243 sec/batch
Epoch 17/20  Iteration 2944/3560 Training loss: 1.6795 0.6447 sec/batch
Epoch 17/20  Iteration 2945/3560 Training loss: 1.6793 0.6565 sec/batch
Epoch 17/20  Iteration 2946/3560 Training loss: 1.6789 0.6321 sec/batch
Epoch 17/20  Iteration 2947/3560 Training loss: 1.6786 0.6395 sec/batch
Epoch 17/20  Iteration 2948/3560 Training loss: 1.6782 0.6307 sec/batch
Epoch 17/20  Iteration 2949/3560 Training loss: 1.6782 0.6318 sec/batch
Epoch 17/20  Iteration 2950/3560 Training loss: 1.6781 0.6948 sec/batch
Epoch 17/20  Iteration 2951/3560 Training loss: 1.6778 0.7980 sec/batch
Epoch 17/20  Iteration 2952/3560 Training loss: 1.6777 0.6623 sec/batch
Epoch 17/20  Iteration 2953/3560 Training loss: 1.6776 0.6398 sec/batch
Epoch 17/20  Iteration 2954/3560 Training loss: 1.6775 0.6391 sec/batch
Epoch 17/20  Iteration 2955/3560 Training loss: 1.6774 0.6486 sec/batch
Epoch 17/20  Iteration 2956/3560 Training loss: 1.6773 0.6101 sec/batch
Epoch 17/20  Iteration 2957/3560 Training loss: 1.6773 0.6235 sec/batch
Epoch 17/20  Iteration 2958/3560 Training loss: 1.6774 0.6369 sec/batch
Epoch 17/20  Iteration 2959/3560 Training loss: 1.6773 0.6347 sec/batch
Epoch 17/20  Iteration 2960/3560 Training loss: 1.6771 0.6496 sec/batch
Epoch 17/20  Iteration 2961/3560 Training loss: 1.6769 0.6349 sec/batch
Epoch 17/20  Iteration 2962/3560 Training loss: 1.6768 0.6510 sec/batch
Epoch 17/20  Iteration 2963/3560 Training loss: 1.6765 0.6368 sec/batch
Epoch 17/20  Iteration 2964/3560 Training loss: 1.6762 0.6468 sec/batch
Epoch 17/20  Iteration 2965/3560 Training loss: 1.6762 0.6184 sec/batch
Epoch 17/20  Iteration 2966/3560 Training loss: 1.6761 0.6372 sec/batch
Epoch 17/20  Iteration 2967/3560 Training loss: 1.6760 0.6387 sec/batch
Epoch 17/20  Iteration 2968/3560 Training loss: 1.6760 0.6530 sec/batch
Epoch 17/20  Iteration 2969/3560 Training loss: 1.6760 0.6378 sec/batch
Epoch 17/20  Iteration 2970/3560 Training loss: 1.6757 0.6312 sec/batch
Epoch 17/20  Iteration 2971/3560 Training loss: 1.6754 0.6367 sec/batch
Epoch 17/20  Iteration 2972/3560 Training loss: 1.6755 0.6457 sec/batch
Epoch 17/20  Iteration 2973/3560 Training loss: 1.6755 0.6911 sec/batch
Epoch 17/20  Iteration 2974/3560 Training loss: 1.6751 0.6552 sec/batch
Epoch 17/20  Iteration 2975/3560 Training loss: 1.6753 0.6550 sec/batch
Epoch 17/20  Iteration 2976/3560 Training loss: 1.6753 0.6558 sec/batch
Epoch 17/20  Iteration 2977/3560 Training loss: 1.6753 0.6446 sec/batch
Epoch 17/20  Iteration 2978/3560 Training loss: 1.6752 0.6529 sec/batch
Epoch 17/20  Iteration 2979/3560 Training loss: 1.6749 0.6375 sec/batch
Epoch 17/20  Iteration 2980/3560 Training loss: 1.6748 0.6374 sec/batch
Epoch 17/20  Iteration 2981/3560 Training loss: 1.6749 0.6324 sec/batch
Epoch 17/20  Iteration 2982/3560 Training loss: 1.6749 0.6360 sec/batch
Epoch 17/20  Iteration 2983/3560 Training loss: 1.6749 0.6467 sec/batch
Epoch 17/20  Iteration 2984/3560 Training loss: 1.6751 0.6382 sec/batch
Epoch 17/20  Iteration 2985/3560 Training loss: 1.6753 0.6372 sec/batch
Epoch 17/20  Iteration 2986/3560 Training loss: 1.6753 0.6324 sec/batch
Epoch 17/20  Iteration 2987/3560 Training loss: 1.6755 0.6278 sec/batch
Epoch 17/20  Iteration 2988/3560 Training loss: 1.6754 0.6303 sec/batch
Epoch 17/20  Iteration 2989/3560 Training loss: 1.6757 0.6439 sec/batch
Epoch 17/20  Iteration 2990/3560 Training loss: 1.6756 0.6715 sec/batch
Epoch 17/20  Iteration 2991/3560 Training loss: 1.6755 0.6653 sec/batch
Epoch 17/20  Iteration 2992/3560 Training loss: 1.6757 0.6598 sec/batch
Epoch 17/20  Iteration 2993/3560 Training loss: 1.6756 0.6479 sec/batch
Epoch 17/20  Iteration 2994/3560 Training loss: 1.6757 0.6322 sec/batch
Epoch 17/20  Iteration 2995/3560 Training loss: 1.6757 0.6356 sec/batch
Epoch 17/20  Iteration 2996/3560 Training loss: 1.6759 0.6326 sec/batch
Epoch 17/20  Iteration 2997/3560 Training loss: 1.6760 0.6398 sec/batch
Epoch 17/20  Iteration 2998/3560 Training loss: 1.6759 0.6349 sec/batch
Epoch 17/20  Iteration 2999/3560 Training loss: 1.6756 0.6501 sec/batch
Epoch 17/20  Iteration 3000/3560 Training loss: 1.6757 0.6291 sec/batch
Epoch 17/20  Iteration 3001/3560 Training loss: 1.6758 0.6310 sec/batch
Epoch 17/20  Iteration 3002/3560 Training loss: 1.6758 0.6335 sec/batch
Epoch 17/20  Iteration 3003/3560 Training loss: 1.6758 0.6364 sec/batch
Epoch 17/20  Iteration 3004/3560 Training loss: 1.6759 0.6372 sec/batch
Epoch 17/20  Iteration 3005/3560 Training loss: 1.6759 0.6361 sec/batch
Epoch 17/20  Iteration 3006/3560 Training loss: 1.6760 0.6343 sec/batch
Epoch 17/20  Iteration 3007/3560 Training loss: 1.6758 0.6563 sec/batch
Epoch 17/20  Iteration 3008/3560 Training loss: 1.6759 0.6530 sec/batch
Epoch 17/20  Iteration 3009/3560 Training loss: 1.6761 0.6470 sec/batch
Epoch 17/20  Iteration 3010/3560 Training loss: 1.6760 0.6385 sec/batch
Epoch 17/20  Iteration 3011/3560 Training loss: 1.6761 0.6367 sec/batch
Epoch 17/20  Iteration 3012/3560 Training loss: 1.6761 0.6352 sec/batch
Epoch 17/20  Iteration 3013/3560 Training loss: 1.6761 0.6238 sec/batch
Epoch 17/20  Iteration 3014/3560 Training loss: 1.6760 0.6442 sec/batch
Epoch 17/20  Iteration 3015/3560 Training loss: 1.6762 0.6499 sec/batch
Epoch 17/20  Iteration 3016/3560 Training loss: 1.6766 0.6843 sec/batch
Epoch 17/20  Iteration 3017/3560 Training loss: 1.6766 0.6474 sec/batch
Epoch 17/20  Iteration 3018/3560 Training loss: 1.6765 0.6777 sec/batch
Epoch 17/20  Iteration 3019/3560 Training loss: 1.6763 0.6559 sec/batch
Epoch 17/20  Iteration 3020/3560 Training loss: 1.6761 0.6674 sec/batch
Epoch 17/20  Iteration 3021/3560 Training loss: 1.6763 0.6620 sec/batch
Epoch 17/20  Iteration 3022/3560 Training loss: 1.6762 0.6632 sec/batch
Epoch 17/20  Iteration 3023/3560 Training loss: 1.6763 0.6652 sec/batch
Epoch 17/20  Iteration 3024/3560 Training loss: 1.6763 0.6423 sec/batch
Epoch 17/20  Iteration 3025/3560 Training loss: 1.6762 0.6386 sec/batch
Epoch 17/20  Iteration 3026/3560 Training loss: 1.6762 0.6387 sec/batch
Epoch 18/20  Iteration 3027/3560 Training loss: 1.7477 0.6323 sec/batch
Epoch 18/20  Iteration 3028/3560 Training loss: 1.7074 0.6056 sec/batch
Epoch 18/20  Iteration 3029/3560 Training loss: 1.6962 0.6393 sec/batch
Epoch 18/20  Iteration 3030/3560 Training loss: 1.6925 0.6570 sec/batch
Epoch 18/20  Iteration 3031/3560 Training loss: 1.6865 0.6355 sec/batch
Epoch 18/20  Iteration 3032/3560 Training loss: 1.6764 0.6360 sec/batch
Epoch 18/20  Iteration 3033/3560 Training loss: 1.6757 0.6817 sec/batch
Epoch 18/20  Iteration 3034/3560 Training loss: 1.6744 0.6645 sec/batch
Epoch 18/20  Iteration 3035/3560 Training loss: 1.6761 0.6270 sec/batch
Epoch 18/20  Iteration 3036/3560 Training loss: 1.6754 0.6351 sec/batch
Epoch 18/20  Iteration 3037/3560 Training loss: 1.6730 0.6386 sec/batch
Epoch 18/20  Iteration 3038/3560 Training loss: 1.6731 0.6502 sec/batch
Epoch 18/20  Iteration 3039/3560 Training loss: 1.6725 0.6384 sec/batch
Epoch 18/20  Iteration 3040/3560 Training loss: 1.6752 0.6356 sec/batch
Epoch 18/20  Iteration 3041/3560 Training loss: 1.6746 0.6319 sec/batch
Epoch 18/20  Iteration 3042/3560 Training loss: 1.6725 0.6349 sec/batch
Epoch 18/20  Iteration 3043/3560 Training loss: 1.6732 0.6404 sec/batch
Epoch 18/20  Iteration 3044/3560 Training loss: 1.6755 0.6418 sec/batch
Epoch 18/20  Iteration 3045/3560 Training loss: 1.6758 0.6323 sec/batch
Epoch 18/20  Iteration 3046/3560 Training loss: 1.6765 0.6619 sec/batch
Epoch 18/20  Iteration 3047/3560 Training loss: 1.6761 0.6375 sec/batch
Epoch 18/20  Iteration 3048/3560 Training loss: 1.6763 0.6477 sec/batch
Epoch 18/20  Iteration 3049/3560 Training loss: 1.6756 0.6339 sec/batch
Epoch 18/20  Iteration 3050/3560 Training loss: 1.6751 0.6391 sec/batch
Epoch 18/20  Iteration 3051/3560 Training loss: 1.6746 0.6243 sec/batch
Epoch 18/20  Iteration 3052/3560 Training loss: 1.6731 0.6369 sec/batch
Epoch 18/20  Iteration 3053/3560 Training loss: 1.6721 0.6402 sec/batch
Epoch 18/20  Iteration 3054/3560 Training loss: 1.6729 0.6752 sec/batch
Epoch 18/20  Iteration 3055/3560 Training loss: 1.6736 0.6402 sec/batch
Epoch 18/20  Iteration 3056/3560 Training loss: 1.6741 0.6716 sec/batch
Epoch 18/20  Iteration 3057/3560 Training loss: 1.6739 0.6460 sec/batch
Epoch 18/20  Iteration 3058/3560 Training loss: 1.6732 0.6481 sec/batch
Epoch 18/20  Iteration 3059/3560 Training loss: 1.6741 0.6424 sec/batch
Epoch 18/20  Iteration 3060/3560 Training loss: 1.6744 0.6552 sec/batch
Epoch 18/20  Iteration 3061/3560 Training loss: 1.6742 0.6559 sec/batch
Epoch 18/20  Iteration 3062/3560 Training loss: 1.6743 0.6681 sec/batch
Epoch 18/20  Iteration 3063/3560 Training loss: 1.6737 0.6612 sec/batch
Epoch 18/20  Iteration 3064/3560 Training loss: 1.6728 0.6365 sec/batch
Epoch 18/20  Iteration 3065/3560 Training loss: 1.6713 0.6366 sec/batch
Epoch 18/20  Iteration 3066/3560 Training loss: 1.6709 0.6389 sec/batch
Epoch 18/20  Iteration 3067/3560 Training loss: 1.6705 0.6386 sec/batch
Epoch 18/20  Iteration 3068/3560 Training loss: 1.6710 0.6335 sec/batch
Epoch 18/20  Iteration 3069/3560 Training loss: 1.6705 0.6584 sec/batch
Epoch 18/20  Iteration 3070/3560 Training loss: 1.6698 0.6385 sec/batch
Epoch 18/20  Iteration 3071/3560 Training loss: 1.6699 0.6277 sec/batch
Epoch 18/20  Iteration 3072/3560 Training loss: 1.6691 0.6400 sec/batch
Epoch 18/20  Iteration 3073/3560 Training loss: 1.6689 0.6586 sec/batch
Epoch 18/20  Iteration 3074/3560 Training loss: 1.6685 0.6340 sec/batch
Epoch 18/20  Iteration 3075/3560 Training loss: 1.6684 0.6804 sec/batch
Epoch 18/20  Iteration 3076/3560 Training loss: 1.6690 0.6794 sec/batch
Epoch 18/20  Iteration 3077/3560 Training loss: 1.6685 0.6637 sec/batch
Epoch 18/20  Iteration 3078/3560 Training loss: 1.6693 0.6520 sec/batch
Epoch 18/20  Iteration 3079/3560 Training loss: 1.6693 0.6597 sec/batch
Epoch 18/20  Iteration 3080/3560 Training loss: 1.6693 0.6291 sec/batch
Epoch 18/20  Iteration 3081/3560 Training loss: 1.6691 0.6610 sec/batch
Epoch 18/20  Iteration 3082/3560 Training loss: 1.6692 0.6296 sec/batch
Epoch 18/20  Iteration 3083/3560 Training loss: 1.6697 0.6428 sec/batch
Epoch 18/20  Iteration 3084/3560 Training loss: 1.6695 0.6420 sec/batch
Epoch 18/20  Iteration 3085/3560 Training loss: 1.6692 0.6605 sec/batch
Epoch 18/20  Iteration 3086/3560 Training loss: 1.6696 0.6323 sec/batch
Epoch 18/20  Iteration 3087/3560 Training loss: 1.6696 0.6463 sec/batch
Epoch 18/20  Iteration 3088/3560 Training loss: 1.6706 0.6307 sec/batch
Epoch 18/20  Iteration 3089/3560 Training loss: 1.6709 0.6444 sec/batch
Epoch 18/20  Iteration 3090/3560 Training loss: 1.6713 0.6438 sec/batch
Epoch 18/20  Iteration 3091/3560 Training loss: 1.6713 0.6622 sec/batch
Epoch 18/20  Iteration 3092/3560 Training loss: 1.6716 0.6552 sec/batch
Epoch 18/20  Iteration 3093/3560 Training loss: 1.6719 0.6794 sec/batch
Epoch 18/20  Iteration 3094/3560 Training loss: 1.6715 0.6802 sec/batch
Epoch 18/20  Iteration 3095/3560 Training loss: 1.6713 0.6383 sec/batch
Epoch 18/20  Iteration 3096/3560 Training loss: 1.6712 0.6395 sec/batch
Epoch 18/20  Iteration 3097/3560 Training loss: 1.6718 0.6366 sec/batch
Epoch 18/20  Iteration 3098/3560 Training loss: 1.6718 0.6355 sec/batch
Epoch 18/20  Iteration 3099/3560 Training loss: 1.6722 0.6623 sec/batch
Epoch 18/20  Iteration 3100/3560 Training loss: 1.6718 0.6840 sec/batch
Epoch 18/20  Iteration 3101/3560 Training loss: 1.6718 0.6628 sec/batch
Epoch 18/20  Iteration 3102/3560 Training loss: 1.6719 0.6406 sec/batch
Epoch 18/20  Iteration 3103/3560 Training loss: 1.6718 0.6311 sec/batch
Epoch 18/20  Iteration 3104/3560 Training loss: 1.6718 0.6453 sec/batch
Epoch 18/20  Iteration 3105/3560 Training loss: 1.6713 0.6388 sec/batch
Epoch 18/20  Iteration 3106/3560 Training loss: 1.6712 0.6334 sec/batch
Epoch 18/20  Iteration 3107/3560 Training loss: 1.6707 0.6321 sec/batch
Epoch 18/20  Iteration 3108/3560 Training loss: 1.6707 0.6574 sec/batch
Epoch 18/20  Iteration 3109/3560 Training loss: 1.6702 0.6421 sec/batch
Epoch 18/20  Iteration 3110/3560 Training loss: 1.6702 0.6330 sec/batch
Epoch 18/20  Iteration 3111/3560 Training loss: 1.6697 0.6318 sec/batch
Epoch 18/20  Iteration 3112/3560 Training loss: 1.6695 0.6317 sec/batch
Epoch 18/20  Iteration 3113/3560 Training loss: 1.6693 0.6388 sec/batch
Epoch 18/20  Iteration 3114/3560 Training loss: 1.6689 0.6233 sec/batch
Epoch 18/20  Iteration 3115/3560 Training loss: 1.6685 0.6676 sec/batch
Epoch 18/20  Iteration 3116/3560 Training loss: 1.6685 0.7321 sec/batch
Epoch 18/20  Iteration 3117/3560 Training loss: 1.6682 0.6940 sec/batch
Epoch 18/20  Iteration 3118/3560 Training loss: 1.6681 0.6911 sec/batch
Epoch 18/20  Iteration 3119/3560 Training loss: 1.6678 0.6512 sec/batch
Epoch 18/20  Iteration 3120/3560 Training loss: 1.6674 0.6438 sec/batch
Epoch 18/20  Iteration 3121/3560 Training loss: 1.6671 0.6500 sec/batch
Epoch 18/20  Iteration 3122/3560 Training loss: 1.6671 0.6447 sec/batch
Epoch 18/20  Iteration 3123/3560 Training loss: 1.6671 0.6331 sec/batch
Epoch 18/20  Iteration 3124/3560 Training loss: 1.6667 0.6887 sec/batch
Epoch 18/20  Iteration 3125/3560 Training loss: 1.6664 0.6324 sec/batch
Epoch 18/20  Iteration 3126/3560 Training loss: 1.6660 0.6446 sec/batch
Epoch 18/20  Iteration 3127/3560 Training loss: 1.6660 0.6514 sec/batch
Epoch 18/20  Iteration 3128/3560 Training loss: 1.6660 0.6543 sec/batch
Epoch 18/20  Iteration 3129/3560 Training loss: 1.6659 0.6353 sec/batch
Epoch 18/20  Iteration 3130/3560 Training loss: 1.6657 0.6594 sec/batch
Epoch 18/20  Iteration 3131/3560 Training loss: 1.6655 0.6533 sec/batch
Epoch 18/20  Iteration 3132/3560 Training loss: 1.6655 0.6371 sec/batch
Epoch 18/20  Iteration 3133/3560 Training loss: 1.6655 0.6261 sec/batch
Epoch 18/20  Iteration 3134/3560 Training loss: 1.6655 0.6494 sec/batch
Epoch 18/20  Iteration 3135/3560 Training loss: 1.6655 0.6398 sec/batch
Epoch 18/20  Iteration 3136/3560 Training loss: 1.6655 0.6395 sec/batch
Epoch 18/20  Iteration 3137/3560 Training loss: 1.6654 0.6413 sec/batch
Epoch 18/20  Iteration 3138/3560 Training loss: 1.6652 0.6213 sec/batch
Epoch 18/20  Iteration 3139/3560 Training loss: 1.6651 0.6473 sec/batch
Epoch 18/20  Iteration 3140/3560 Training loss: 1.6649 0.6310 sec/batch
Epoch 18/20  Iteration 3141/3560 Training loss: 1.6647 0.6243 sec/batch
Epoch 18/20  Iteration 3142/3560 Training loss: 1.6645 0.6608 sec/batch
Epoch 18/20  Iteration 3143/3560 Training loss: 1.6644 0.6214 sec/batch
Epoch 18/20  Iteration 3144/3560 Training loss: 1.6643 0.6348 sec/batch
Epoch 18/20  Iteration 3145/3560 Training loss: 1.6643 0.6517 sec/batch
Epoch 18/20  Iteration 3146/3560 Training loss: 1.6642 0.6661 sec/batch
Epoch 18/20  Iteration 3147/3560 Training loss: 1.6641 0.6646 sec/batch
Epoch 18/20  Iteration 3148/3560 Training loss: 1.6638 0.6458 sec/batch
Epoch 18/20  Iteration 3149/3560 Training loss: 1.6635 0.6392 sec/batch
Epoch 18/20  Iteration 3150/3560 Training loss: 1.6635 0.6322 sec/batch
Epoch 18/20  Iteration 3151/3560 Training loss: 1.6635 0.6291 sec/batch
Epoch 18/20  Iteration 3152/3560 Training loss: 1.6631 0.6390 sec/batch
Epoch 18/20  Iteration 3153/3560 Training loss: 1.6632 0.6460 sec/batch
Epoch 18/20  Iteration 3154/3560 Training loss: 1.6633 0.6593 sec/batch
Epoch 18/20  Iteration 3155/3560 Training loss: 1.6632 0.6728 sec/batch
Epoch 18/20  Iteration 3156/3560 Training loss: 1.6630 0.6497 sec/batch
Epoch 18/20  Iteration 3157/3560 Training loss: 1.6626 0.6405 sec/batch
Epoch 18/20  Iteration 3158/3560 Training loss: 1.6625 0.6550 sec/batch
Epoch 18/20  Iteration 3159/3560 Training loss: 1.6625 0.6669 sec/batch
Epoch 18/20  Iteration 3160/3560 Training loss: 1.6626 0.6520 sec/batch
Epoch 18/20  Iteration 3161/3560 Training loss: 1.6626 0.6223 sec/batch
Epoch 18/20  Iteration 3162/3560 Training loss: 1.6626 0.6347 sec/batch
Epoch 18/20  Iteration 3163/3560 Training loss: 1.6627 0.6445 sec/batch
Epoch 18/20  Iteration 3164/3560 Training loss: 1.6627 0.6792 sec/batch
Epoch 18/20  Iteration 3165/3560 Training loss: 1.6629 0.6640 sec/batch
Epoch 18/20  Iteration 3166/3560 Training loss: 1.6629 0.6553 sec/batch
Epoch 18/20  Iteration 3167/3560 Training loss: 1.6632 0.6368 sec/batch
Epoch 18/20  Iteration 3168/3560 Training loss: 1.6632 0.6196 sec/batch
Epoch 18/20  Iteration 3169/3560 Training loss: 1.6631 0.6271 sec/batch
Epoch 18/20  Iteration 3170/3560 Training loss: 1.6632 0.6753 sec/batch
Epoch 18/20  Iteration 3171/3560 Training loss: 1.6631 0.6125 sec/batch
Epoch 18/20  Iteration 3172/3560 Training loss: 1.6632 0.6388 sec/batch
Epoch 18/20  Iteration 3173/3560 Training loss: 1.6633 0.6325 sec/batch
Epoch 18/20  Iteration 3174/3560 Training loss: 1.6635 0.6188 sec/batch
Epoch 18/20  Iteration 3175/3560 Training loss: 1.6636 0.6310 sec/batch
Epoch 18/20  Iteration 3176/3560 Training loss: 1.6634 0.6314 sec/batch
Epoch 18/20  Iteration 3177/3560 Training loss: 1.6631 0.6278 sec/batch
Epoch 18/20  Iteration 3178/3560 Training loss: 1.6631 0.6470 sec/batch
Epoch 18/20  Iteration 3179/3560 Training loss: 1.6632 0.6318 sec/batch
Epoch 18/20  Iteration 3180/3560 Training loss: 1.6631 0.6344 sec/batch
Epoch 18/20  Iteration 3181/3560 Training loss: 1.6632 0.6201 sec/batch
Epoch 18/20  Iteration 3182/3560 Training loss: 1.6632 0.6371 sec/batch
Epoch 18/20  Iteration 3183/3560 Training loss: 1.6633 0.6350 sec/batch
Epoch 18/20  Iteration 3184/3560 Training loss: 1.6633 0.6504 sec/batch
Epoch 18/20  Iteration 3185/3560 Training loss: 1.6631 0.6316 sec/batch
Epoch 18/20  Iteration 3186/3560 Training loss: 1.6632 0.6581 sec/batch
Epoch 18/20  Iteration 3187/3560 Training loss: 1.6635 0.6312 sec/batch
Epoch 18/20  Iteration 3188/3560 Training loss: 1.6635 0.6307 sec/batch
Epoch 18/20  Iteration 3189/3560 Training loss: 1.6635 0.6246 sec/batch
Epoch 18/20  Iteration 3190/3560 Training loss: 1.6635 0.6280 sec/batch
Epoch 18/20  Iteration 3191/3560 Training loss: 1.6635 0.6259 sec/batch
Epoch 18/20  Iteration 3192/3560 Training loss: 1.6634 0.6475 sec/batch
Epoch 18/20  Iteration 3193/3560 Training loss: 1.6635 0.7412 sec/batch
Epoch 18/20  Iteration 3194/3560 Training loss: 1.6640 0.6428 sec/batch
Epoch 18/20  Iteration 3195/3560 Training loss: 1.6640 0.6497 sec/batch
Epoch 18/20  Iteration 3196/3560 Training loss: 1.6640 0.6412 sec/batch
Epoch 18/20  Iteration 3197/3560 Training loss: 1.6638 0.6291 sec/batch
Epoch 18/20  Iteration 3198/3560 Training loss: 1.6637 0.6183 sec/batch
Epoch 18/20  Iteration 3199/3560 Training loss: 1.6639 0.6343 sec/batch
Epoch 18/20  Iteration 3200/3560 Training loss: 1.6638 0.8148 sec/batch
Epoch 18/20  Iteration 3201/3560 Training loss: 1.6639 0.8361 sec/batch
Epoch 18/20  Iteration 3202/3560 Training loss: 1.6638 0.6541 sec/batch
Epoch 18/20  Iteration 3203/3560 Training loss: 1.6637 0.6009 sec/batch
Epoch 18/20  Iteration 3204/3560 Training loss: 1.6638 0.6094 sec/batch
Epoch 19/20  Iteration 3205/3560 Training loss: 1.7330 0.6281 sec/batch
Epoch 19/20  Iteration 3206/3560 Training loss: 1.6925 0.6244 sec/batch
Epoch 19/20  Iteration 3207/3560 Training loss: 1.6835 0.6380 sec/batch
Epoch 19/20  Iteration 3208/3560 Training loss: 1.6787 0.6371 sec/batch
Epoch 19/20  Iteration 3209/3560 Training loss: 1.6738 0.6430 sec/batch
Epoch 19/20  Iteration 3210/3560 Training loss: 1.6641 0.6555 sec/batch
Epoch 19/20  Iteration 3211/3560 Training loss: 1.6643 0.6338 sec/batch
Epoch 19/20  Iteration 3212/3560 Training loss: 1.6630 0.6360 sec/batch
Epoch 19/20  Iteration 3213/3560 Training loss: 1.6639 0.6305 sec/batch
Epoch 19/20  Iteration 3214/3560 Training loss: 1.6635 0.6274 sec/batch
Epoch 19/20  Iteration 3215/3560 Training loss: 1.6605 0.6250 sec/batch
Epoch 19/20  Iteration 3216/3560 Training loss: 1.6588 0.6321 sec/batch
Epoch 19/20  Iteration 3217/3560 Training loss: 1.6572 0.6481 sec/batch
Epoch 19/20  Iteration 3218/3560 Training loss: 1.6597 0.6404 sec/batch
Epoch 19/20  Iteration 3219/3560 Training loss: 1.6583 0.6619 sec/batch
Epoch 19/20  Iteration 3220/3560 Training loss: 1.6568 0.6415 sec/batch
Epoch 19/20  Iteration 3221/3560 Training loss: 1.6572 0.6463 sec/batch
Epoch 19/20  Iteration 3222/3560 Training loss: 1.6594 0.6349 sec/batch
Epoch 19/20  Iteration 3223/3560 Training loss: 1.6591 0.6266 sec/batch
Epoch 19/20  Iteration 3224/3560 Training loss: 1.6596 0.6305 sec/batch
Epoch 19/20  Iteration 3225/3560 Training loss: 1.6596 0.6490 sec/batch
Epoch 19/20  Iteration 3226/3560 Training loss: 1.6598 0.6255 sec/batch
Epoch 19/20  Iteration 3227/3560 Training loss: 1.6588 0.6246 sec/batch
Epoch 19/20  Iteration 3228/3560 Training loss: 1.6591 0.6260 sec/batch
Epoch 19/20  Iteration 3229/3560 Training loss: 1.6587 0.6431 sec/batch
Epoch 19/20  Iteration 3230/3560 Training loss: 1.6573 0.6403 sec/batch
Epoch 19/20  Iteration 3231/3560 Training loss: 1.6564 0.6350 sec/batch
Epoch 19/20  Iteration 3232/3560 Training loss: 1.6564 0.6294 sec/batch
Epoch 19/20  Iteration 3233/3560 Training loss: 1.6573 0.6488 sec/batch
Epoch 19/20  Iteration 3234/3560 Training loss: 1.6579 0.6277 sec/batch
Epoch 19/20  Iteration 3235/3560 Training loss: 1.6576 0.6268 sec/batch
Epoch 19/20  Iteration 3236/3560 Training loss: 1.6570 0.6403 sec/batch
Epoch 19/20  Iteration 3237/3560 Training loss: 1.6577 0.6360 sec/batch
Epoch 19/20  Iteration 3238/3560 Training loss: 1.6583 0.6413 sec/batch
Epoch 19/20  Iteration 3239/3560 Training loss: 1.6586 0.6328 sec/batch
Epoch 19/20  Iteration 3240/3560 Training loss: 1.6586 0.7100 sec/batch
Epoch 19/20  Iteration 3241/3560 Training loss: 1.6580 0.6744 sec/batch
Epoch 19/20  Iteration 3242/3560 Training loss: 1.6572 0.6566 sec/batch
Epoch 19/20  Iteration 3243/3560 Training loss: 1.6559 0.6345 sec/batch
Epoch 19/20  Iteration 3244/3560 Training loss: 1.6555 0.6047 sec/batch
Epoch 19/20  Iteration 3245/3560 Training loss: 1.6553 0.6483 sec/batch
Epoch 19/20  Iteration 3246/3560 Training loss: 1.6561 0.6276 sec/batch
Epoch 19/20  Iteration 3247/3560 Training loss: 1.6558 0.6293 sec/batch
Epoch 19/20  Iteration 3248/3560 Training loss: 1.6550 0.6360 sec/batch
Epoch 19/20  Iteration 3249/3560 Training loss: 1.6549 0.6497 sec/batch
Epoch 19/20  Iteration 3250/3560 Training loss: 1.6539 0.6337 sec/batch
Epoch 19/20  Iteration 3251/3560 Training loss: 1.6537 0.6142 sec/batch
Epoch 19/20  Iteration 3252/3560 Training loss: 1.6533 0.6417 sec/batch
Epoch 19/20  Iteration 3253/3560 Training loss: 1.6533 0.6150 sec/batch
Epoch 19/20  Iteration 3254/3560 Training loss: 1.6541 0.6272 sec/batch
Epoch 19/20  Iteration 3255/3560 Training loss: 1.6536 0.6438 sec/batch
Epoch 19/20  Iteration 3256/3560 Training loss: 1.6545 0.6819 sec/batch
Epoch 19/20  Iteration 3257/3560 Training loss: 1.6543 0.6976 sec/batch
Epoch 19/20  Iteration 3258/3560 Training loss: 1.6545 0.6499 sec/batch
Epoch 19/20  Iteration 3259/3560 Training loss: 1.6542 0.6398 sec/batch
Epoch 19/20  Iteration 3260/3560 Training loss: 1.6546 0.7151 sec/batch
Epoch 19/20  Iteration 3261/3560 Training loss: 1.6551 0.7425 sec/batch
Epoch 19/20  Iteration 3262/3560 Training loss: 1.6546 0.8453 sec/batch
Epoch 19/20  Iteration 3263/3560 Training loss: 1.6543 0.7607 sec/batch
Epoch 19/20  Iteration 3264/3560 Training loss: 1.6548 0.6993 sec/batch
Epoch 19/20  Iteration 3265/3560 Training loss: 1.6546 0.7097 sec/batch
Epoch 19/20  Iteration 3266/3560 Training loss: 1.6555 0.7325 sec/batch
Epoch 19/20  Iteration 3267/3560 Training loss: 1.6559 0.6700 sec/batch
Epoch 19/20  Iteration 3268/3560 Training loss: 1.6562 0.6628 sec/batch
Epoch 19/20  Iteration 3269/3560 Training loss: 1.6561 0.6317 sec/batch
Epoch 19/20  Iteration 3270/3560 Training loss: 1.6563 0.6490 sec/batch
Epoch 19/20  Iteration 3271/3560 Training loss: 1.6566 0.6637 sec/batch
Epoch 19/20  Iteration 3272/3560 Training loss: 1.6560 0.6279 sec/batch
Epoch 19/20  Iteration 3273/3560 Training loss: 1.6561 0.6362 sec/batch
Epoch 19/20  Iteration 3274/3560 Training loss: 1.6559 0.6332 sec/batch
Epoch 19/20  Iteration 3275/3560 Training loss: 1.6564 0.6226 sec/batch
Epoch 19/20  Iteration 3276/3560 Training loss: 1.6565 0.6238 sec/batch
Epoch 19/20  Iteration 3277/3560 Training loss: 1.6568 0.6066 sec/batch
Epoch 19/20  Iteration 3278/3560 Training loss: 1.6565 0.6194 sec/batch
Epoch 19/20  Iteration 3279/3560 Training loss: 1.6565 0.6389 sec/batch
Epoch 19/20  Iteration 3280/3560 Training loss: 1.6566 0.6223 sec/batch
Epoch 19/20  Iteration 3281/3560 Training loss: 1.6566 0.6362 sec/batch
Epoch 19/20  Iteration 3282/3560 Training loss: 1.6564 0.6280 sec/batch
Epoch 19/20  Iteration 3283/3560 Training loss: 1.6559 0.6294 sec/batch
Epoch 19/20  Iteration 3284/3560 Training loss: 1.6558 0.6466 sec/batch
Epoch 19/20  Iteration 3285/3560 Training loss: 1.6553 0.6227 sec/batch
Epoch 19/20  Iteration 3286/3560 Training loss: 1.6553 0.6225 sec/batch
Epoch 19/20  Iteration 3287/3560 Training loss: 1.6548 0.6410 sec/batch
Epoch 19/20  Iteration 3288/3560 Training loss: 1.6547 0.6261 sec/batch
Epoch 19/20  Iteration 3289/3560 Training loss: 1.6544 0.6377 sec/batch
Epoch 19/20  Iteration 3290/3560 Training loss: 1.6542 0.6238 sec/batch
Epoch 19/20  Iteration 3291/3560 Training loss: 1.6540 0.6386 sec/batch
Epoch 19/20  Iteration 3292/3560 Training loss: 1.6537 0.6311 sec/batch
Epoch 19/20  Iteration 3293/3560 Training loss: 1.6533 0.6229 sec/batch
Epoch 19/20  Iteration 3294/3560 Training loss: 1.6535 0.6241 sec/batch
Epoch 19/20  Iteration 3295/3560 Training loss: 1.6533 0.6449 sec/batch
Epoch 19/20  Iteration 3296/3560 Training loss: 1.6531 0.6280 sec/batch
Epoch 19/20  Iteration 3297/3560 Training loss: 1.6527 0.6328 sec/batch
Epoch 19/20  Iteration 3298/3560 Training loss: 1.6524 0.6310 sec/batch
Epoch 19/20  Iteration 3299/3560 Training loss: 1.6521 0.6276 sec/batch
Epoch 19/20  Iteration 3300/3560 Training loss: 1.6521 0.6306 sec/batch
Epoch 19/20  Iteration 3301/3560 Training loss: 1.6520 0.6198 sec/batch
Epoch 19/20  Iteration 3302/3560 Training loss: 1.6516 0.6415 sec/batch
Epoch 19/20  Iteration 3303/3560 Training loss: 1.6513 0.6638 sec/batch
Epoch 19/20  Iteration 3304/3560 Training loss: 1.6509 0.6415 sec/batch
Epoch 19/20  Iteration 3305/3560 Training loss: 1.6509 0.6440 sec/batch
Epoch 19/20  Iteration 3306/3560 Training loss: 1.6507 0.6333 sec/batch
Epoch 19/20  Iteration 3307/3560 Training loss: 1.6505 0.6559 sec/batch
Epoch 19/20  Iteration 3308/3560 Training loss: 1.6503 0.6383 sec/batch
Epoch 19/20  Iteration 3309/3560 Training loss: 1.6501 0.6172 sec/batch
Epoch 19/20  Iteration 3310/3560 Training loss: 1.6501 0.6303 sec/batch
Epoch 19/20  Iteration 3311/3560 Training loss: 1.6500 0.6410 sec/batch
Epoch 19/20  Iteration 3312/3560 Training loss: 1.6499 0.6339 sec/batch
Epoch 19/20  Iteration 3313/3560 Training loss: 1.6499 0.6332 sec/batch
Epoch 19/20  Iteration 3314/3560 Training loss: 1.6500 0.6229 sec/batch
Epoch 19/20  Iteration 3315/3560 Training loss: 1.6500 0.6319 sec/batch
Epoch 19/20  Iteration 3316/3560 Training loss: 1.6498 0.6365 sec/batch
Epoch 19/20  Iteration 3317/3560 Training loss: 1.6497 0.6230 sec/batch
Epoch 19/20  Iteration 3318/3560 Training loss: 1.6495 0.6216 sec/batch
Epoch 19/20  Iteration 3319/3560 Training loss: 1.6493 0.6561 sec/batch
Epoch 19/20  Iteration 3320/3560 Training loss: 1.6490 0.6218 sec/batch
Epoch 19/20  Iteration 3321/3560 Training loss: 1.6490 0.6453 sec/batch
Epoch 19/20  Iteration 3322/3560 Training loss: 1.6489 0.6364 sec/batch
Epoch 19/20  Iteration 3323/3560 Training loss: 1.6488 0.6494 sec/batch
Epoch 19/20  Iteration 3324/3560 Training loss: 1.6488 0.6316 sec/batch
Epoch 19/20  Iteration 3325/3560 Training loss: 1.6488 0.6444 sec/batch
Epoch 19/20  Iteration 3326/3560 Training loss: 1.6485 0.6425 sec/batch
Epoch 19/20  Iteration 3327/3560 Training loss: 1.6483 0.6630 sec/batch
Epoch 19/20  Iteration 3328/3560 Training loss: 1.6484 0.6419 sec/batch
Epoch 19/20  Iteration 3329/3560 Training loss: 1.6484 0.6238 sec/batch
Epoch 19/20  Iteration 3330/3560 Training loss: 1.6480 0.6297 sec/batch
Epoch 19/20  Iteration 3331/3560 Training loss: 1.6482 0.6342 sec/batch
Epoch 19/20  Iteration 3332/3560 Training loss: 1.6482 0.6298 sec/batch
Epoch 19/20  Iteration 3333/3560 Training loss: 1.6481 0.6235 sec/batch
Epoch 19/20  Iteration 3334/3560 Training loss: 1.6480 0.6253 sec/batch
Epoch 19/20  Iteration 3335/3560 Training loss: 1.6476 0.6545 sec/batch
Epoch 19/20  Iteration 3336/3560 Training loss: 1.6474 0.6242 sec/batch
Epoch 19/20  Iteration 3337/3560 Training loss: 1.6474 0.6248 sec/batch
Epoch 19/20  Iteration 3338/3560 Training loss: 1.6474 0.6301 sec/batch
Epoch 19/20  Iteration 3339/3560 Training loss: 1.6474 0.6353 sec/batch
Epoch 19/20  Iteration 3340/3560 Training loss: 1.6475 0.6357 sec/batch
Epoch 19/20  Iteration 3341/3560 Training loss: 1.6476 0.6280 sec/batch
Epoch 19/20  Iteration 3342/3560 Training loss: 1.6477 0.6315 sec/batch
Epoch 19/20  Iteration 3343/3560 Training loss: 1.6479 0.6503 sec/batch
Epoch 19/20  Iteration 3344/3560 Training loss: 1.6478 0.6410 sec/batch
Epoch 19/20  Iteration 3345/3560 Training loss: 1.6482 0.6281 sec/batch
Epoch 19/20  Iteration 3346/3560 Training loss: 1.6482 0.6310 sec/batch
Epoch 19/20  Iteration 3347/3560 Training loss: 1.6481 0.6203 sec/batch
Epoch 19/20  Iteration 3348/3560 Training loss: 1.6483 0.6234 sec/batch
Epoch 19/20  Iteration 3349/3560 Training loss: 1.6482 0.6243 sec/batch
Epoch 19/20  Iteration 3350/3560 Training loss: 1.6483 0.6263 sec/batch
Epoch 19/20  Iteration 3351/3560 Training loss: 1.6484 0.6480 sec/batch
Epoch 19/20  Iteration 3352/3560 Training loss: 1.6486 0.6168 sec/batch
Epoch 19/20  Iteration 3353/3560 Training loss: 1.6486 0.6332 sec/batch
Epoch 19/20  Iteration 3354/3560 Training loss: 1.6485 0.6460 sec/batch
Epoch 19/20  Iteration 3355/3560 Training loss: 1.6484 0.6293 sec/batch
Epoch 19/20  Iteration 3356/3560 Training loss: 1.6485 0.6394 sec/batch
Epoch 19/20  Iteration 3357/3560 Training loss: 1.6485 0.6564 sec/batch
Epoch 19/20  Iteration 3358/3560 Training loss: 1.6485 0.6797 sec/batch
Epoch 19/20  Iteration 3359/3560 Training loss: 1.6486 0.6816 sec/batch
Epoch 19/20  Iteration 3360/3560 Training loss: 1.6486 0.6855 sec/batch
Epoch 19/20  Iteration 3361/3560 Training loss: 1.6487 0.6563 sec/batch
Epoch 19/20  Iteration 3362/3560 Training loss: 1.6487 0.6651 sec/batch
Epoch 19/20  Iteration 3363/3560 Training loss: 1.6485 0.7102 sec/batch
Epoch 19/20  Iteration 3364/3560 Training loss: 1.6487 0.6385 sec/batch
Epoch 19/20  Iteration 3365/3560 Training loss: 1.6490 0.6834 sec/batch
Epoch 19/20  Iteration 3366/3560 Training loss: 1.6490 0.6763 sec/batch
Epoch 19/20  Iteration 3367/3560 Training loss: 1.6490 0.6315 sec/batch
Epoch 19/20  Iteration 3368/3560 Training loss: 1.6490 0.6473 sec/batch
Epoch 19/20  Iteration 3369/3560 Training loss: 1.6490 0.6649 sec/batch
Epoch 19/20  Iteration 3370/3560 Training loss: 1.6490 0.6380 sec/batch
Epoch 19/20  Iteration 3371/3560 Training loss: 1.6491 0.6545 sec/batch
Epoch 19/20  Iteration 3372/3560 Training loss: 1.6496 0.6755 sec/batch
Epoch 19/20  Iteration 3373/3560 Training loss: 1.6495 0.6334 sec/batch
Epoch 19/20  Iteration 3374/3560 Training loss: 1.6495 0.6546 sec/batch
Epoch 19/20  Iteration 3375/3560 Training loss: 1.6494 0.6100 sec/batch
Epoch 19/20  Iteration 3376/3560 Training loss: 1.6492 0.6393 sec/batch
Epoch 19/20  Iteration 3377/3560 Training loss: 1.6493 0.6344 sec/batch
Epoch 19/20  Iteration 3378/3560 Training loss: 1.6493 0.6394 sec/batch
Epoch 19/20  Iteration 3379/3560 Training loss: 1.6494 0.6417 sec/batch
Epoch 19/20  Iteration 3380/3560 Training loss: 1.6494 0.6343 sec/batch
Epoch 19/20  Iteration 3381/3560 Training loss: 1.6492 0.6461 sec/batch
Epoch 19/20  Iteration 3382/3560 Training loss: 1.6493 0.6589 sec/batch
Epoch 20/20  Iteration 3383/3560 Training loss: 1.7215 0.6318 sec/batch
Epoch 20/20  Iteration 3384/3560 Training loss: 1.6891 0.6345 sec/batch
Epoch 20/20  Iteration 3385/3560 Training loss: 1.6736 0.6397 sec/batch
Epoch 20/20  Iteration 3386/3560 Training loss: 1.6688 0.6446 sec/batch
Epoch 20/20  Iteration 3387/3560 Training loss: 1.6593 0.6155 sec/batch
Epoch 20/20  Iteration 3388/3560 Training loss: 1.6503 0.6308 sec/batch
Epoch 20/20  Iteration 3389/3560 Training loss: 1.6507 0.6351 sec/batch
Epoch 20/20  Iteration 3390/3560 Training loss: 1.6487 0.6587 sec/batch
Epoch 20/20  Iteration 3391/3560 Training loss: 1.6507 0.6406 sec/batch
Epoch 20/20  Iteration 3392/3560 Training loss: 1.6501 0.6273 sec/batch
Epoch 20/20  Iteration 3393/3560 Training loss: 1.6473 0.6363 sec/batch
Epoch 20/20  Iteration 3394/3560 Training loss: 1.6473 0.6346 sec/batch
Epoch 20/20  Iteration 3395/3560 Training loss: 1.6466 0.6385 sec/batch
Epoch 20/20  Iteration 3396/3560 Training loss: 1.6493 0.6243 sec/batch
Epoch 20/20  Iteration 3397/3560 Training loss: 1.6484 0.6497 sec/batch
Epoch 20/20  Iteration 3398/3560 Training loss: 1.6474 0.6368 sec/batch
Epoch 20/20  Iteration 3399/3560 Training loss: 1.6476 0.6337 sec/batch
Epoch 20/20  Iteration 3400/3560 Training loss: 1.6499 0.6427 sec/batch
Epoch 20/20  Iteration 3401/3560 Training loss: 1.6500 0.6333 sec/batch
Epoch 20/20  Iteration 3402/3560 Training loss: 1.6507 0.6257 sec/batch
Epoch 20/20  Iteration 3403/3560 Training loss: 1.6500 0.6372 sec/batch
Epoch 20/20  Iteration 3404/3560 Training loss: 1.6506 0.6488 sec/batch
Epoch 20/20  Iteration 3405/3560 Training loss: 1.6502 0.6475 sec/batch
Epoch 20/20  Iteration 3406/3560 Training loss: 1.6500 0.6352 sec/batch
Epoch 20/20  Iteration 3407/3560 Training loss: 1.6495 0.6318 sec/batch
Epoch 20/20  Iteration 3408/3560 Training loss: 1.6482 0.6367 sec/batch
Epoch 20/20  Iteration 3409/3560 Training loss: 1.6470 0.6521 sec/batch
Epoch 20/20  Iteration 3410/3560 Training loss: 1.6475 0.6361 sec/batch
Epoch 20/20  Iteration 3411/3560 Training loss: 1.6481 0.6297 sec/batch
Epoch 20/20  Iteration 3412/3560 Training loss: 1.6484 0.6345 sec/batch
Epoch 20/20  Iteration 3413/3560 Training loss: 1.6480 0.6548 sec/batch
Epoch 20/20  Iteration 3414/3560 Training loss: 1.6471 0.6314 sec/batch
Epoch 20/20  Iteration 3415/3560 Training loss: 1.6477 0.6384 sec/batch
Epoch 20/20  Iteration 3416/3560 Training loss: 1.6486 0.6426 sec/batch
Epoch 20/20  Iteration 3417/3560 Training loss: 1.6485 0.6438 sec/batch
Epoch 20/20  Iteration 3418/3560 Training loss: 1.6486 0.6605 sec/batch
Epoch 20/20  Iteration 3419/3560 Training loss: 1.6483 0.6416 sec/batch
Epoch 20/20  Iteration 3420/3560 Training loss: 1.6474 0.6451 sec/batch
Epoch 20/20  Iteration 3421/3560 Training loss: 1.6465 0.6452 sec/batch
Epoch 20/20  Iteration 3422/3560 Training loss: 1.6457 0.6347 sec/batch
Epoch 20/20  Iteration 3423/3560 Training loss: 1.6453 0.6540 sec/batch
Epoch 20/20  Iteration 3424/3560 Training loss: 1.6459 0.6343 sec/batch
Epoch 20/20  Iteration 3425/3560 Training loss: 1.6453 0.6391 sec/batch
Epoch 20/20  Iteration 3426/3560 Training loss: 1.6447 0.6368 sec/batch
Epoch 20/20  Iteration 3427/3560 Training loss: 1.6449 0.6320 sec/batch
Epoch 20/20  Iteration 3428/3560 Training loss: 1.6440 0.6428 sec/batch
Epoch 20/20  Iteration 3429/3560 Training loss: 1.6438 0.6498 sec/batch
Epoch 20/20  Iteration 3430/3560 Training loss: 1.6433 0.6223 sec/batch
Epoch 20/20  Iteration 3431/3560 Training loss: 1.6432 0.6301 sec/batch
Epoch 20/20  Iteration 3432/3560 Training loss: 1.6437 0.6397 sec/batch
Epoch 20/20  Iteration 3433/3560 Training loss: 1.6432 0.6298 sec/batch
Epoch 20/20  Iteration 3434/3560 Training loss: 1.6441 0.6486 sec/batch
Epoch 20/20  Iteration 3435/3560 Training loss: 1.6441 0.6386 sec/batch
Epoch 20/20  Iteration 3436/3560 Training loss: 1.6444 0.6457 sec/batch
Epoch 20/20  Iteration 3437/3560 Training loss: 1.6442 0.6591 sec/batch
Epoch 20/20  Iteration 3438/3560 Training loss: 1.6443 0.6286 sec/batch
Epoch 20/20  Iteration 3439/3560 Training loss: 1.6449 0.6348 sec/batch
Epoch 20/20  Iteration 3440/3560 Training loss: 1.6446 0.6391 sec/batch
Epoch 20/20  Iteration 3441/3560 Training loss: 1.6442 0.6418 sec/batch
Epoch 20/20  Iteration 3442/3560 Training loss: 1.6447 0.6401 sec/batch
Epoch 20/20  Iteration 3443/3560 Training loss: 1.6445 0.6277 sec/batch
Epoch 20/20  Iteration 3444/3560 Training loss: 1.6454 0.6634 sec/batch
Epoch 20/20  Iteration 3445/3560 Training loss: 1.6457 0.7216 sec/batch
Epoch 20/20  Iteration 3446/3560 Training loss: 1.6460 0.6419 sec/batch
Epoch 20/20  Iteration 3447/3560 Training loss: 1.6460 0.6626 sec/batch
Epoch 20/20  Iteration 3448/3560 Training loss: 1.6463 0.6506 sec/batch
Epoch 20/20  Iteration 3449/3560 Training loss: 1.6466 0.6329 sec/batch
Epoch 20/20  Iteration 3450/3560 Training loss: 1.6460 0.6179 sec/batch
Epoch 20/20  Iteration 3451/3560 Training loss: 1.6458 0.6221 sec/batch
Epoch 20/20  Iteration 3452/3560 Training loss: 1.6457 0.6550 sec/batch
Epoch 20/20  Iteration 3453/3560 Training loss: 1.6463 0.6373 sec/batch
Epoch 20/20  Iteration 3454/3560 Training loss: 1.6465 0.6363 sec/batch
Epoch 20/20  Iteration 3455/3560 Training loss: 1.6470 0.6416 sec/batch
Epoch 20/20  Iteration 3456/3560 Training loss: 1.6466 0.6419 sec/batch
Epoch 20/20  Iteration 3457/3560 Training loss: 1.6465 0.6390 sec/batch
Epoch 20/20  Iteration 3458/3560 Training loss: 1.6465 0.6304 sec/batch
Epoch 20/20  Iteration 3459/3560 Training loss: 1.6466 0.6392 sec/batch
Epoch 20/20  Iteration 3460/3560 Training loss: 1.6466 0.6475 sec/batch
Epoch 20/20  Iteration 3461/3560 Training loss: 1.6460 0.6370 sec/batch
Epoch 20/20  Iteration 3462/3560 Training loss: 1.6459 0.6329 sec/batch
Epoch 20/20  Iteration 3463/3560 Training loss: 1.6455 0.6483 sec/batch
Epoch 20/20  Iteration 3464/3560 Training loss: 1.6454 0.6345 sec/batch
Epoch 20/20  Iteration 3465/3560 Training loss: 1.6450 0.6506 sec/batch
Epoch 20/20  Iteration 3466/3560 Training loss: 1.6448 0.6372 sec/batch
Epoch 20/20  Iteration 3467/3560 Training loss: 1.6442 0.6275 sec/batch
Epoch 20/20  Iteration 3468/3560 Training loss: 1.6439 0.6499 sec/batch
Epoch 20/20  Iteration 3469/3560 Training loss: 1.6436 0.6419 sec/batch
Epoch 20/20  Iteration 3470/3560 Training loss: 1.6434 0.6562 sec/batch
Epoch 20/20  Iteration 3471/3560 Training loss: 1.6429 0.6438 sec/batch
Epoch 20/20  Iteration 3472/3560 Training loss: 1.6430 0.6391 sec/batch
Epoch 20/20  Iteration 3473/3560 Training loss: 1.6428 0.6356 sec/batch
Epoch 20/20  Iteration 3474/3560 Training loss: 1.6426 0.6525 sec/batch
Epoch 20/20  Iteration 3475/3560 Training loss: 1.6422 0.6353 sec/batch
Epoch 20/20  Iteration 3476/3560 Training loss: 1.6418 0.6551 sec/batch
Epoch 20/20  Iteration 3477/3560 Training loss: 1.6415 0.6320 sec/batch
Epoch 20/20  Iteration 3478/3560 Training loss: 1.6416 0.6410 sec/batch
Epoch 20/20  Iteration 3479/3560 Training loss: 1.6416 0.6377 sec/batch
Epoch 20/20  Iteration 3480/3560 Training loss: 1.6413 0.6312 sec/batch
Epoch 20/20  Iteration 3481/3560 Training loss: 1.6409 0.6331 sec/batch
Epoch 20/20  Iteration 3482/3560 Training loss: 1.6405 0.6418 sec/batch
Epoch 20/20  Iteration 3483/3560 Training loss: 1.6405 0.6212 sec/batch
Epoch 20/20  Iteration 3484/3560 Training loss: 1.6404 0.6682 sec/batch
Epoch 20/20  Iteration 3485/3560 Training loss: 1.6401 0.6256 sec/batch
Epoch 20/20  Iteration 3486/3560 Training loss: 1.6399 0.6425 sec/batch
Epoch 20/20  Iteration 3487/3560 Training loss: 1.6397 0.6382 sec/batch
Epoch 20/20  Iteration 3488/3560 Training loss: 1.6395 0.6390 sec/batch
Epoch 20/20  Iteration 3489/3560 Training loss: 1.6394 0.6306 sec/batch
Epoch 20/20  Iteration 3490/3560 Training loss: 1.6393 0.6353 sec/batch
Epoch 20/20  Iteration 3491/3560 Training loss: 1.6394 0.6351 sec/batch
Epoch 20/20  Iteration 3492/3560 Training loss: 1.6394 0.6748 sec/batch
Epoch 20/20  Iteration 3493/3560 Training loss: 1.6393 0.6317 sec/batch
Epoch 20/20  Iteration 3494/3560 Training loss: 1.6391 0.6336 sec/batch
Epoch 20/20  Iteration 3495/3560 Training loss: 1.6391 0.6297 sec/batch
Epoch 20/20  Iteration 3496/3560 Training loss: 1.6389 0.6410 sec/batch
Epoch 20/20  Iteration 3497/3560 Training loss: 1.6386 0.6262 sec/batch
Epoch 20/20  Iteration 3498/3560 Training loss: 1.6383 0.6536 sec/batch
Epoch 20/20  Iteration 3499/3560 Training loss: 1.6383 0.6737 sec/batch
Epoch 20/20  Iteration 3500/3560 Training loss: 1.6382 0.6984 sec/batch
Epoch 20/20  Iteration 3501/3560 Training loss: 1.6382 0.6585 sec/batch
Epoch 20/20  Iteration 3502/3560 Training loss: 1.6381 0.6766 sec/batch
Epoch 20/20  Iteration 3503/3560 Training loss: 1.6381 0.6430 sec/batch
Epoch 20/20  Iteration 3504/3560 Training loss: 1.6377 0.6416 sec/batch
Epoch 20/20  Iteration 3505/3560 Training loss: 1.6375 0.6714 sec/batch
Epoch 20/20  Iteration 3506/3560 Training loss: 1.6376 0.6259 sec/batch
Epoch 20/20  Iteration 3507/3560 Training loss: 1.6377 0.6673 sec/batch
Epoch 20/20  Iteration 3508/3560 Training loss: 1.6373 0.6534 sec/batch
Epoch 20/20  Iteration 3509/3560 Training loss: 1.6375 0.6216 sec/batch
Epoch 20/20  Iteration 3510/3560 Training loss: 1.6376 0.6750 sec/batch
Epoch 20/20  Iteration 3511/3560 Training loss: 1.6375 0.6567 sec/batch
Epoch 20/20  Iteration 3512/3560 Training loss: 1.6374 0.6401 sec/batch
Epoch 20/20  Iteration 3513/3560 Training loss: 1.6371 0.6367 sec/batch
Epoch 20/20  Iteration 3514/3560 Training loss: 1.6370 0.6304 sec/batch
Epoch 20/20  Iteration 3515/3560 Training loss: 1.6370 0.6526 sec/batch
Epoch 20/20  Iteration 3516/3560 Training loss: 1.6369 0.6500 sec/batch
Epoch 20/20  Iteration 3517/3560 Training loss: 1.6369 0.6268 sec/batch
Epoch 20/20  Iteration 3518/3560 Training loss: 1.6370 0.6339 sec/batch
Epoch 20/20  Iteration 3519/3560 Training loss: 1.6371 0.6316 sec/batch
Epoch 20/20  Iteration 3520/3560 Training loss: 1.6371 0.6513 sec/batch
Epoch 20/20  Iteration 3521/3560 Training loss: 1.6372 0.6502 sec/batch
Epoch 20/20  Iteration 3522/3560 Training loss: 1.6372 0.6384 sec/batch
Epoch 20/20  Iteration 3523/3560 Training loss: 1.6375 0.6491 sec/batch
Epoch 20/20  Iteration 3524/3560 Training loss: 1.6375 0.6521 sec/batch
Epoch 20/20  Iteration 3525/3560 Training loss: 1.6374 0.6405 sec/batch
Epoch 20/20  Iteration 3526/3560 Training loss: 1.6375 0.6577 sec/batch
Epoch 20/20  Iteration 3527/3560 Training loss: 1.6374 0.6407 sec/batch
Epoch 20/20  Iteration 3528/3560 Training loss: 1.6375 0.6454 sec/batch
Epoch 20/20  Iteration 3529/3560 Training loss: 1.6375 0.6232 sec/batch
Epoch 20/20  Iteration 3530/3560 Training loss: 1.6377 0.6337 sec/batch
Epoch 20/20  Iteration 3531/3560 Training loss: 1.6378 0.6524 sec/batch
Epoch 20/20  Iteration 3532/3560 Training loss: 1.6377 0.6355 sec/batch
Epoch 20/20  Iteration 3533/3560 Training loss: 1.6374 0.6354 sec/batch
Epoch 20/20  Iteration 3534/3560 Training loss: 1.6374 0.6346 sec/batch
Epoch 20/20  Iteration 3535/3560 Training loss: 1.6374 0.6249 sec/batch
Epoch 20/20  Iteration 3536/3560 Training loss: 1.6373 0.6272 sec/batch
Epoch 20/20  Iteration 3537/3560 Training loss: 1.6373 0.6404 sec/batch
Epoch 20/20  Iteration 3538/3560 Training loss: 1.6374 0.6527 sec/batch
Epoch 20/20  Iteration 3539/3560 Training loss: 1.6374 0.6546 sec/batch
Epoch 20/20  Iteration 3540/3560 Training loss: 1.6374 0.6596 sec/batch
Epoch 20/20  Iteration 3541/3560 Training loss: 1.6372 0.6323 sec/batch
Epoch 20/20  Iteration 3542/3560 Training loss: 1.6374 0.6333 sec/batch
Epoch 20/20  Iteration 3543/3560 Training loss: 1.6376 0.6283 sec/batch
Epoch 20/20  Iteration 3544/3560 Training loss: 1.6375 0.6328 sec/batch
Epoch 20/20  Iteration 3545/3560 Training loss: 1.6375 0.6368 sec/batch
Epoch 20/20  Iteration 3546/3560 Training loss: 1.6376 0.6481 sec/batch
Epoch 20/20  Iteration 3547/3560 Training loss: 1.6376 0.6379 sec/batch
Epoch 20/20  Iteration 3548/3560 Training loss: 1.6376 0.6382 sec/batch
Epoch 20/20  Iteration 3549/3560 Training loss: 1.6377 0.6359 sec/batch
Epoch 20/20  Iteration 3550/3560 Training loss: 1.6381 0.6295 sec/batch
Epoch 20/20  Iteration 3551/3560 Training loss: 1.6381 0.6209 sec/batch
Epoch 20/20  Iteration 3552/3560 Training loss: 1.6380 0.6340 sec/batch
Epoch 20/20  Iteration 3553/3560 Training loss: 1.6379 0.6411 sec/batch
Epoch 20/20  Iteration 3554/3560 Training loss: 1.6378 0.7034 sec/batch
Epoch 20/20  Iteration 3555/3560 Training loss: 1.6379 0.6697 sec/batch
Epoch 20/20  Iteration 3556/3560 Training loss: 1.6380 0.6371 sec/batch
Epoch 20/20  Iteration 3557/3560 Training loss: 1.6381 0.6504 sec/batch
Epoch 20/20  Iteration 3558/3560 Training loss: 1.6380 0.6318 sec/batch
Epoch 20/20  Iteration 3559/3560 Training loss: 1.6379 0.6323 sec/batch
Epoch 20/20  Iteration 3560/3560 Training loss: 1.6380 0.6348 sec/batch
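The loss printed on each line looks like a running average over the current epoch: it jumps back up on the first iteration of every epoch and then settles as the epoch goes on, so the last line of each epoch is the number worth comparing across epochs (here roughly 1.664 → 1.649 → 1.638 for epochs 18-20, i.e. the per-epoch improvement has shrunk to a little over 0.01). If you want to boil a long log like this down to one number per epoch, a minimal sketch is to parse the printed lines — assuming the console output has been saved to a plain-text file, here called train_log.txt, which is not something the notebook itself does:

import re
from collections import OrderedDict

# Matches lines like:
# "Epoch 18/20  Iteration 3064/3560 Training loss: 1.6728 0.6365 sec/batch"
line_re = re.compile(r"Epoch (\d+)/\d+\s+Iteration (\d+)/\d+\s+"
                     r"Training loss: ([\d.]+)\s+([\d.]+) sec/batch")

last_loss = OrderedDict()   # epoch -> last reported (running-average) loss
batch_times = []            # per-iteration wall-clock time in seconds

with open('train_log.txt') as f:     # assumed filename, see note above
    for line in f:
        m = line_re.search(line)
        if m is None:
            continue
        epoch = int(m.group(1))
        loss, sec = float(m.group(3)), float(m.group(4))
        # Overwriting on every line means the stored value ends up being the
        # end-of-epoch loss (from the most recent run, if the log holds several).
        last_loss[epoch] = loss
        batch_times.append(sec)

for epoch, loss in last_loss.items():
    print('Epoch {:2d}: final training loss {:.4f}'.format(epoch, loss))
print('Mean time per batch: {:.3f} sec'.format(sum(batch_times) / len(batch_times)))

The output that follows appears to be from a fresh run of the training cell — it starts again from epoch 1, with the loss back up around 4.4.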
Epoch 1/20  Iteration 1/3560 Training loss: 4.4193 0.6963 sec/batch
Epoch 1/20  Iteration 2/3560 Training loss: 4.4123 0.6250 sec/batch
Epoch 1/20  Iteration 3/3560 Training loss: 4.4037 0.6259 sec/batch
Epoch 1/20  Iteration 4/3560 Training loss: 4.3921 0.6271 sec/batch
Epoch 1/20  Iteration 5/3560 Training loss: 4.3740 0.6262 sec/batch
Epoch 1/20  Iteration 6/3560 Training loss: 4.3441 0.6188 sec/batch
Epoch 1/20  Iteration 7/3560 Training loss: 4.2940 0.6437 sec/batch
Epoch 1/20  Iteration 8/3560 Training loss: 4.2340 0.6377 sec/batch
Epoch 1/20  Iteration 9/3560 Training loss: 4.1776 0.6314 sec/batch
Epoch 1/20  Iteration 10/3560 Training loss: 4.1294 0.6281 sec/batch
Epoch 1/20  Iteration 11/3560 Training loss: 4.0838 0.6837 sec/batch
Epoch 1/20  Iteration 12/3560 Training loss: 4.0417 0.6511 sec/batch
Epoch 1/20  Iteration 13/3560 Training loss: 4.0007 0.7686 sec/batch
Epoch 1/20  Iteration 14/3560 Training loss: 3.9657 0.6557 sec/batch
Epoch 1/20  Iteration 15/3560 Training loss: 3.9319 0.7392 sec/batch
Epoch 1/20  Iteration 16/3560 Training loss: 3.9021 0.6271 sec/batch
Epoch 1/20  Iteration 17/3560 Training loss: 3.8733 0.6577 sec/batch
Epoch 1/20  Iteration 18/3560 Training loss: 3.8488 0.7158 sec/batch
Epoch 1/20  Iteration 19/3560 Training loss: 3.8243 0.7206 sec/batch
Epoch 1/20  Iteration 20/3560 Training loss: 3.8005 0.6289 sec/batch
Epoch 1/20  Iteration 21/3560 Training loss: 3.7794 0.6581 sec/batch
Epoch 1/20  Iteration 22/3560 Training loss: 3.7596 0.6594 sec/batch
Epoch 1/20  Iteration 23/3560 Training loss: 3.7413 0.7289 sec/batch
Epoch 1/20  Iteration 24/3560 Training loss: 3.7239 0.6275 sec/batch
Epoch 1/20  Iteration 25/3560 Training loss: 3.7072 0.6411 sec/batch
Epoch 1/20  Iteration 26/3560 Training loss: 3.6920 0.7001 sec/batch
Epoch 1/20  Iteration 27/3560 Training loss: 3.6779 0.6416 sec/batch
Epoch 1/20  Iteration 28/3560 Training loss: 3.6638 0.7395 sec/batch
Epoch 1/20  Iteration 29/3560 Training loss: 3.6504 0.6178 sec/batch
Epoch 1/20  Iteration 30/3560 Training loss: 3.6380 0.6692 sec/batch
Epoch 1/20  Iteration 31/3560 Training loss: 3.6270 0.7000 sec/batch
Epoch 1/20  Iteration 32/3560 Training loss: 3.6158 0.6554 sec/batch
Epoch 1/20  Iteration 33/3560 Training loss: 3.6048 0.6179 sec/batch
Epoch 1/20  Iteration 34/3560 Training loss: 3.5955 0.6320 sec/batch
Epoch 1/20  Iteration 35/3560 Training loss: 3.5856 0.6290 sec/batch
Epoch 1/20  Iteration 36/3560 Training loss: 3.5769 0.6228 sec/batch
Epoch 1/20  Iteration 37/3560 Training loss: 3.5675 0.6140 sec/batch
Epoch 1/20  Iteration 38/3560 Training loss: 3.5588 0.6312 sec/batch
Epoch 1/20  Iteration 39/3560 Training loss: 3.5503 0.6184 sec/batch
Epoch 1/20  Iteration 40/3560 Training loss: 3.5420 0.6220 sec/batch
Epoch 1/20  Iteration 41/3560 Training loss: 3.5342 0.6156 sec/batch
Epoch 1/20  Iteration 42/3560 Training loss: 3.5271 0.6314 sec/batch
Epoch 1/20  Iteration 43/3560 Training loss: 3.5200 0.6204 sec/batch
Epoch 1/20  Iteration 44/3560 Training loss: 3.5130 0.6018 sec/batch
Epoch 1/20  Iteration 45/3560 Training loss: 3.5063 0.6170 sec/batch
Epoch 1/20  Iteration 46/3560 Training loss: 3.5001 0.6352 sec/batch
Epoch 1/20  Iteration 47/3560 Training loss: 3.4943 0.6212 sec/batch
Epoch 1/20  Iteration 48/3560 Training loss: 3.4887 0.6146 sec/batch
Epoch 1/20  Iteration 49/3560 Training loss: 3.4833 0.6196 sec/batch
Epoch 1/20  Iteration 50/3560 Training loss: 3.4781 0.6190 sec/batch
Epoch 1/20  Iteration 51/3560 Training loss: 3.4729 0.6167 sec/batch
Epoch 1/20  Iteration 52/3560 Training loss: 3.4678 0.6305 sec/batch
Epoch 1/20  Iteration 53/3560 Training loss: 3.4631 0.6215 sec/batch
Epoch 1/20  Iteration 54/3560 Training loss: 3.4581 0.6344 sec/batch
Epoch 1/20  Iteration 55/3560 Training loss: 3.4537 0.6311 sec/batch
Epoch 1/20  Iteration 56/3560 Training loss: 3.4489 0.6268 sec/batch
Epoch 1/20  Iteration 57/3560 Training loss: 3.4445 0.6191 sec/batch
Epoch 1/20  Iteration 58/3560 Training loss: 3.4403 0.6274 sec/batch
Epoch 1/20  Iteration 59/3560 Training loss: 3.4359 0.6234 sec/batch
Epoch 1/20  Iteration 60/3560 Training loss: 3.4320 0.6236 sec/batch
Epoch 1/20  Iteration 61/3560 Training loss: 3.4281 0.6338 sec/batch
Epoch 1/20  Iteration 62/3560 Training loss: 3.4247 0.6305 sec/batch
Epoch 1/20  Iteration 63/3560 Training loss: 3.4214 0.6235 sec/batch
Epoch 1/20  Iteration 64/3560 Training loss: 3.4176 0.5984 sec/batch
Epoch 1/20  Iteration 65/3560 Training loss: 3.4139 0.6181 sec/batch
Epoch 1/20  Iteration 66/3560 Training loss: 3.4107 0.6150 sec/batch
Epoch 1/20  Iteration 67/3560 Training loss: 3.4076 0.6630 sec/batch
Epoch 1/20  Iteration 68/3560 Training loss: 3.4038 0.6206 sec/batch
Epoch 1/20  Iteration 69/3560 Training loss: 3.4004 0.6829 sec/batch
Epoch 1/20  Iteration 70/3560 Training loss: 3.3975 0.6219 sec/batch
Epoch 1/20  Iteration 71/3560 Training loss: 3.3945 0.6597 sec/batch
Epoch 1/20  Iteration 72/3560 Training loss: 3.3919 0.6334 sec/batch
Epoch 1/20  Iteration 73/3560 Training loss: 3.3890 0.6210 sec/batch
Epoch 1/20  Iteration 74/3560 Training loss: 3.3863 0.6105 sec/batch
Epoch 1/20  Iteration 75/3560 Training loss: 3.3837 0.6153 sec/batch
Epoch 1/20  Iteration 76/3560 Training loss: 3.3812 0.6114 sec/batch
Epoch 1/20  Iteration 77/3560 Training loss: 3.3787 0.6084 sec/batch
Epoch 1/20  Iteration 78/3560 Training loss: 3.3762 0.6380 sec/batch
Epoch 1/20  Iteration 79/3560 Training loss: 3.3737 0.6217 sec/batch
Epoch 1/20  Iteration 80/3560 Training loss: 3.3711 0.6327 sec/batch
Epoch 1/20  Iteration 81/3560 Training loss: 3.3686 0.6203 sec/batch
Epoch 1/20  Iteration 82/3560 Training loss: 3.3663 0.6170 sec/batch
Epoch 1/20  Iteration 83/3560 Training loss: 3.3642 0.6173 sec/batch
Epoch 1/20  Iteration 84/3560 Training loss: 3.3619 0.6229 sec/batch
Epoch 1/20  Iteration 85/3560 Training loss: 3.3595 0.6242 sec/batch
Epoch 1/20  Iteration 86/3560 Training loss: 3.3573 0.6280 sec/batch
Epoch 1/20  Iteration 87/3560 Training loss: 3.3551 0.6156 sec/batch
Epoch 1/20  Iteration 88/3560 Training loss: 3.3529 0.6177 sec/batch
Epoch 1/20  Iteration 89/3560 Training loss: 3.3511 0.6376 sec/batch
Epoch 1/20  Iteration 90/3560 Training loss: 3.3492 0.6042 sec/batch
Epoch 1/20  Iteration 91/3560 Training loss: 3.3473 0.6220 sec/batch
Epoch 1/20  Iteration 92/3560 Training loss: 3.3454 0.6218 sec/batch
Epoch 1/20  Iteration 93/3560 Training loss: 3.3435 0.6316 sec/batch
Epoch 1/20  Iteration 94/3560 Training loss: 3.3417 0.6340 sec/batch
Epoch 1/20  Iteration 95/3560 Training loss: 3.3399 0.6459 sec/batch
Epoch 1/20  Iteration 96/3560 Training loss: 3.3381 0.6199 sec/batch
Epoch 1/20  Iteration 97/3560 Training loss: 3.3365 0.6264 sec/batch
Epoch 1/20  Iteration 98/3560 Training loss: 3.3347 0.6254 sec/batch
Epoch 1/20  Iteration 99/3560 Training loss: 3.3330 0.6257 sec/batch
Epoch 1/20  Iteration 100/3560 Training loss: 3.3313 0.6181 sec/batch
Epoch 1/20  Iteration 101/3560 Training loss: 3.3297 0.7808 sec/batch
Epoch 1/20  Iteration 102/3560 Training loss: 3.3281 0.7535 sec/batch
Epoch 1/20  Iteration 103/3560 Training loss: 3.3266 0.6271 sec/batch
Epoch 1/20  Iteration 104/3560 Training loss: 3.3250 0.6263 sec/batch
Epoch 1/20  Iteration 105/3560 Training loss: 3.3235 0.6061 sec/batch
Epoch 1/20  Iteration 106/3560 Training loss: 3.3220 0.6391 sec/batch
Epoch 1/20  Iteration 107/3560 Training loss: 3.3204 0.6197 sec/batch
Epoch 1/20  Iteration 108/3560 Training loss: 3.3187 0.6187 sec/batch
Epoch 1/20  Iteration 109/3560 Training loss: 3.3173 0.6232 sec/batch
Epoch 1/20  Iteration 110/3560 Training loss: 3.3156 0.6336 sec/batch
Epoch 1/20  Iteration 111/3560 Training loss: 3.3141 0.6230 sec/batch
Epoch 1/20  Iteration 112/3560 Training loss: 3.3127 0.6202 sec/batch
Epoch 1/20  Iteration 113/3560 Training loss: 3.3113 0.6217 sec/batch
Epoch 1/20  Iteration 114/3560 Training loss: 3.3097 0.6213 sec/batch
Epoch 1/20  Iteration 115/3560 Training loss: 3.3082 0.6190 sec/batch
Epoch 1/20  Iteration 116/3560 Training loss: 3.3069 0.6372 sec/batch
Epoch 1/20  Iteration 117/3560 Training loss: 3.3054 0.6304 sec/batch
Epoch 1/20  Iteration 118/3560 Training loss: 3.3042 0.6423 sec/batch
Epoch 1/20  Iteration 119/3560 Training loss: 3.3031 0.6228 sec/batch
Epoch 1/20  Iteration 120/3560 Training loss: 3.3017 0.6183 sec/batch
Epoch 1/20  Iteration 121/3560 Training loss: 3.3007 0.6235 sec/batch
Epoch 1/20  Iteration 122/3560 Training loss: 3.2995 0.6250 sec/batch
Epoch 1/20  Iteration 123/3560 Training loss: 3.2984 0.6649 sec/batch
Epoch 1/20  Iteration 124/3560 Training loss: 3.2973 0.7143 sec/batch
Epoch 1/20  Iteration 125/3560 Training loss: 3.2960 0.7437 sec/batch
Epoch 1/20  Iteration 126/3560 Training loss: 3.2947 0.7034 sec/batch
Epoch 1/20  Iteration 127/3560 Training loss: 3.2936 0.6201 sec/batch
Epoch 1/20  Iteration 128/3560 Training loss: 3.2925 0.6206 sec/batch
Epoch 1/20  Iteration 129/3560 Training loss: 3.2913 0.6300 sec/batch
Epoch 1/20  Iteration 130/3560 Training loss: 3.2902 0.6136 sec/batch
Epoch 1/20  Iteration 131/3560 Training loss: 3.2892 0.6226 sec/batch
Epoch 1/20  Iteration 132/3560 Training loss: 3.2881 0.6150 sec/batch
Epoch 1/20  Iteration 133/3560 Training loss: 3.2871 0.6439 sec/batch
Epoch 1/20  Iteration 134/3560 Training loss: 3.2860 0.6253 sec/batch
Epoch 1/20  Iteration 135/3560 Training loss: 3.2847 0.6194 sec/batch
Epoch 1/20  Iteration 136/3560 Training loss: 3.2834 0.6315 sec/batch
Epoch 1/20  Iteration 137/3560 Training loss: 3.2823 0.6218 sec/batch
Epoch 1/20  Iteration 138/3560 Training loss: 3.2811 0.6210 sec/batch
Epoch 1/20  Iteration 139/3560 Training loss: 3.2802 0.6305 sec/batch
Epoch 1/20  Iteration 140/3560 Training loss: 3.2791 0.6343 sec/batch
Epoch 1/20  Iteration 141/3560 Training loss: 3.2781 0.6098 sec/batch
Epoch 1/20  Iteration 142/3560 Training loss: 3.2769 0.6221 sec/batch
Epoch 1/20  Iteration 143/3560 Training loss: 3.2758 0.6284 sec/batch
Epoch 1/20  Iteration 144/3560 Training loss: 3.2748 0.6132 sec/batch
Epoch 1/20  Iteration 145/3560 Training loss: 3.2738 0.6241 sec/batch
Epoch 1/20  Iteration 146/3560 Training loss: 3.2728 0.6222 sec/batch
Epoch 1/20  Iteration 147/3560 Training loss: 3.2719 0.6221 sec/batch
Epoch 1/20  Iteration 148/3560 Training loss: 3.2711 0.6271 sec/batch
Epoch 1/20  Iteration 149/3560 Training loss: 3.2701 0.6270 sec/batch
Epoch 1/20  Iteration 150/3560 Training loss: 3.2692 0.6553 sec/batch
Epoch 1/20  Iteration 151/3560 Training loss: 3.2683 0.6877 sec/batch
Epoch 1/20  Iteration 152/3560 Training loss: 3.2676 0.6625 sec/batch
Epoch 1/20  Iteration 153/3560 Training loss: 3.2667 0.7208 sec/batch
Epoch 1/20  Iteration 154/3560 Training loss: 3.2657 0.6619 sec/batch
Epoch 1/20  Iteration 155/3560 Training loss: 3.2647 0.6815 sec/batch
Epoch 1/20  Iteration 156/3560 Training loss: 3.2637 0.6704 sec/batch
Epoch 1/20  Iteration 157/3560 Training loss: 3.2627 0.6466 sec/batch
Epoch 1/20  Iteration 158/3560 Training loss: 3.2617 0.6318 sec/batch
Epoch 1/20  Iteration 159/3560 Training loss: 3.2607 0.6288 sec/batch
Epoch 1/20  Iteration 160/3560 Training loss: 3.2596 0.6171 sec/batch
Epoch 1/20  Iteration 161/3560 Training loss: 3.2587 0.6334 sec/batch
Epoch 1/20  Iteration 162/3560 Training loss: 3.2576 0.6368 sec/batch
Epoch 1/20  Iteration 163/3560 Training loss: 3.2565 0.6616 sec/batch
Epoch 1/20  Iteration 164/3560 Training loss: 3.2556 0.6517 sec/batch
Epoch 1/20  Iteration 165/3560 Training loss: 3.2546 0.6769 sec/batch
Epoch 1/20  Iteration 166/3560 Training loss: 3.2537 0.6482 sec/batch
Epoch 1/20  Iteration 167/3560 Training loss: 3.2527 0.6517 sec/batch
Epoch 1/20  Iteration 168/3560 Training loss: 3.2518 0.6862 sec/batch
Epoch 1/20  Iteration 169/3560 Training loss: 3.2508 0.6399 sec/batch
Epoch 1/20  Iteration 170/3560 Training loss: 3.2498 0.6474 sec/batch
Epoch 1/20  Iteration 171/3560 Training loss: 3.2489 0.6490 sec/batch
Epoch 1/20  Iteration 172/3560 Training loss: 3.2481 0.6725 sec/batch
Epoch 1/20  Iteration 173/3560 Training loss: 3.2473 0.6509 sec/batch
Epoch 1/20  Iteration 174/3560 Training loss: 3.2466 0.6939 sec/batch
Epoch 1/20  Iteration 175/3560 Training loss: 3.2457 0.6453 sec/batch
Epoch 1/20  Iteration 176/3560 Training loss: 3.2448 0.6305 sec/batch
Epoch 1/20  Iteration 177/3560 Training loss: 3.2438 0.6294 sec/batch
Epoch 1/20  Iteration 178/3560 Training loss: 3.2427 0.6452 sec/batch
Epoch 2/20  Iteration 179/3560 Training loss: 3.1055 0.6574 sec/batch
Epoch 2/20  Iteration 180/3560 Training loss: 3.0706 0.7065 sec/batch
Epoch 2/20  Iteration 181/3560 Training loss: 3.0585 0.7538 sec/batch
Epoch 2/20  Iteration 182/3560 Training loss: 3.0537 0.7655 sec/batch
Epoch 2/20  Iteration 183/3560 Training loss: 3.0509 0.7012 sec/batch
Epoch 2/20  Iteration 184/3560 Training loss: 3.0492 0.6329 sec/batch
Epoch 2/20  Iteration 185/3560 Training loss: 3.0477 0.7188 sec/batch
Epoch 2/20  Iteration 186/3560 Training loss: 3.0466 0.7143 sec/batch
Epoch 2/20  Iteration 187/3560 Training loss: 3.0436 0.6946 sec/batch
Epoch 2/20  Iteration 188/3560 Training loss: 3.0413 0.7753 sec/batch
Epoch 2/20  Iteration 189/3560 Training loss: 3.0369 0.6296 sec/batch
Epoch 2/20  Iteration 190/3560 Training loss: 3.0344 0.7221 sec/batch
Epoch 2/20  Iteration 191/3560 Training loss: 3.0316 0.6660 sec/batch
Epoch 2/20  Iteration 192/3560 Training loss: 3.0301 0.6465 sec/batch
Epoch 2/20  Iteration 193/3560 Training loss: 3.0279 0.6544 sec/batch
Epoch 2/20  Iteration 194/3560 Training loss: 3.0256 0.6994 sec/batch
Epoch 2/20  Iteration 195/3560 Training loss: 3.0225 0.7796 sec/batch
Epoch 2/20  Iteration 196/3560 Training loss: 3.0219 0.9429 sec/batch
Epoch 2/20  Iteration 197/3560 Training loss: 3.0195 0.8831 sec/batch
Epoch 2/20  Iteration 198/3560 Training loss: 3.0152 0.7736 sec/batch
Epoch 2/20  Iteration 199/3560 Training loss: 3.0125 0.7296 sec/batch
Epoch 2/20  Iteration 200/3560 Training loss: 3.0102 0.6699 sec/batch
Epoch 2/20  Iteration 201/3560 Training loss: 3.0074 0.6469 sec/batch
Epoch 2/20  Iteration 202/3560 Training loss: 3.0043 0.6488 sec/batch
Epoch 2/20  Iteration 203/3560 Training loss: 3.0009 0.7276 sec/batch
Epoch 2/20  Iteration 204/3560 Training loss: 2.9986 0.6427 sec/batch
Epoch 2/20  Iteration 205/3560 Training loss: 2.9962 0.6393 sec/batch
Epoch 2/20  Iteration 206/3560 Training loss: 2.9929 0.6360 sec/batch
Epoch 2/20  Iteration 207/3560 Training loss: 2.9901 0.6339 sec/batch
Epoch 2/20  Iteration 208/3560 Training loss: 2.9876 0.6330 sec/batch
Epoch 2/20  Iteration 209/3560 Training loss: 2.9853 0.7273 sec/batch
Epoch 2/20  Iteration 210/3560 Training loss: 2.9823 0.6250 sec/batch
Epoch 2/20  Iteration 211/3560 Training loss: 2.9790 0.6224 sec/batch
Epoch 2/20  Iteration 212/3560 Training loss: 2.9762 0.6588 sec/batch
Epoch 2/20  Iteration 213/3560 Training loss: 2.9726 0.6581 sec/batch
Epoch 2/20  Iteration 214/3560 Training loss: 2.9698 0.6295 sec/batch
Epoch 2/20  Iteration 215/3560 Training loss: 2.9663 0.6519 sec/batch
Epoch 2/20  Iteration 216/3560 Training loss: 2.9628 0.6626 sec/batch
Epoch 2/20  Iteration 217/3560 Training loss: 2.9594 0.6352 sec/batch
Epoch 2/20  Iteration 218/3560 Training loss: 2.9561 0.6334 sec/batch
Epoch 2/20  Iteration 219/3560 Training loss: 2.9528 0.6262 sec/batch
Epoch 2/20  Iteration 220/3560 Training loss: 2.9494 0.6260 sec/batch
Epoch 2/20  Iteration 221/3560 Training loss: 2.9460 0.6496 sec/batch
Epoch 2/20  Iteration 222/3560 Training loss: 2.9428 0.6300 sec/batch
Epoch 2/20  Iteration 223/3560 Training loss: 2.9394 0.6247 sec/batch
Epoch 2/20  Iteration 224/3560 Training loss: 2.9359 0.6359 sec/batch
Epoch 2/20  Iteration 225/3560 Training loss: 2.9330 0.6252 sec/batch
Epoch 2/20  Iteration 226/3560 Training loss: 2.9302 0.6267 sec/batch
Epoch 2/20  Iteration 227/3560 Training loss: 2.9272 0.6274 sec/batch
Epoch 2/20  Iteration 228/3560 Training loss: 2.9245 0.6229 sec/batch
Epoch 2/20  Iteration 229/3560 Training loss: 2.9215 0.6302 sec/batch
Epoch 2/20  Iteration 230/3560 Training loss: 2.9185 0.6408 sec/batch
Epoch 2/20  Iteration 231/3560 Training loss: 2.9157 0.6272 sec/batch
Epoch 2/20  Iteration 232/3560 Training loss: 2.9125 0.6484 sec/batch
Epoch 2/20  Iteration 233/3560 Training loss: 2.9094 0.6230 sec/batch
Epoch 2/20  Iteration 234/3560 Training loss: 2.9065 0.6225 sec/batch
Epoch 2/20  Iteration 235/3560 Training loss: 2.9036 0.6331 sec/batch
Epoch 2/20  Iteration 236/3560 Training loss: 2.9004 0.6328 sec/batch
Epoch 2/20  Iteration 237/3560 Training loss: 2.8973 0.6291 sec/batch
Epoch 2/20  Iteration 238/3560 Training loss: 2.8945 0.6474 sec/batch
Epoch 2/20  Iteration 239/3560 Training loss: 2.8915 0.6519 sec/batch
Epoch 2/20  Iteration 240/3560 Training loss: 2.8890 0.6398 sec/batch
Epoch 2/20  Iteration 241/3560 Training loss: 2.8866 0.6592 sec/batch
Epoch 2/20  Iteration 242/3560 Training loss: 2.8837 0.6808 sec/batch
Epoch 2/20  Iteration 243/3560 Training loss: 2.8807 0.6524 sec/batch
Epoch 2/20  Iteration 244/3560 Training loss: 2.8783 0.6803 sec/batch
Epoch 2/20  Iteration 245/3560 Training loss: 2.8756 0.7795 sec/batch
Epoch 2/20  Iteration 246/3560 Training loss: 2.8726 0.7150 sec/batch
Epoch 2/20  Iteration 247/3560 Training loss: 2.8697 0.6632 sec/batch
Epoch 2/20  Iteration 248/3560 Training loss: 2.8672 0.7440 sec/batch
Epoch 2/20  Iteration 249/3560 Training loss: 2.8646 0.6746 sec/batch
Epoch 2/20  Iteration 250/3560 Training loss: 2.8623 0.6586 sec/batch
Epoch 2/20  Iteration 251/3560 Training loss: 2.8598 0.7591 sec/batch
Epoch 2/20  Iteration 252/3560 Training loss: 2.8573 0.6563 sec/batch
Epoch 2/20  Iteration 253/3560 Training loss: 2.8548 0.6596 sec/batch
Epoch 2/20  Iteration 254/3560 Training loss: 2.8528 0.6536 sec/batch
Epoch 2/20  Iteration 255/3560 Training loss: 2.8503 0.6616 sec/batch
Epoch 2/20  Iteration 256/3560 Training loss: 2.8480 0.6432 sec/batch
Epoch 2/20  Iteration 257/3560 Training loss: 2.8457 0.6478 sec/batch
Epoch 2/20  Iteration 258/3560 Training loss: 2.8432 0.6388 sec/batch
Epoch 2/20  Iteration 259/3560 Training loss: 2.8408 0.6365 sec/batch
Epoch 2/20  Iteration 260/3560 Training loss: 2.8387 0.6494 sec/batch
Epoch 2/20  Iteration 261/3560 Training loss: 2.8365 0.6430 sec/batch
Epoch 2/20  Iteration 262/3560 Training loss: 2.8341 0.6513 sec/batch
Epoch 2/20  Iteration 263/3560 Training loss: 2.8315 0.6934 sec/batch
Epoch 2/20  Iteration 264/3560 Training loss: 2.8291 0.7014 sec/batch
Epoch 2/20  Iteration 265/3560 Training loss: 2.8269 1.0630 sec/batch
Epoch 2/20  Iteration 266/3560 Training loss: 2.8248 0.7973 sec/batch
Epoch 2/20  Iteration 267/3560 Training loss: 2.8226 0.9728 sec/batch
Epoch 2/20  Iteration 268/3560 Training loss: 2.8206 1.1310 sec/batch
Epoch 2/20  Iteration 269/3560 Training loss: 2.8185 1.0127 sec/batch
Epoch 2/20  Iteration 270/3560 Training loss: 2.8164 0.8163 sec/batch
Epoch 2/20  Iteration 271/3560 Training loss: 2.8143 0.6620 sec/batch
Epoch 2/20  Iteration 272/3560 Training loss: 2.8122 0.6891 sec/batch
Epoch 2/20  Iteration 273/3560 Training loss: 2.8098 0.6978 sec/batch
Epoch 2/20  Iteration 274/3560 Training loss: 2.8077 0.6838 sec/batch
Epoch 2/20  Iteration 275/3560 Training loss: 2.8058 0.6787 sec/batch
Epoch 2/20  Iteration 276/3560 Training loss: 2.8038 0.7089 sec/batch
Epoch 2/20  Iteration 277/3560 Training loss: 2.8019 0.7078 sec/batch
Epoch 2/20  Iteration 278/3560 Training loss: 2.8000 0.6987 sec/batch
Epoch 2/20  Iteration 279/3560 Training loss: 2.7981 0.6698 sec/batch
Epoch 2/20  Iteration 280/3560 Training loss: 2.7962 0.6691 sec/batch
Epoch 2/20  Iteration 281/3560 Training loss: 2.7941 0.6880 sec/batch
Epoch 2/20  Iteration 282/3560 Training loss: 2.7921 0.6995 sec/batch
Epoch 2/20  Iteration 283/3560 Training loss: 2.7902 0.6566 sec/batch
Epoch 2/20  Iteration 284/3560 Training loss: 2.7884 0.6919 sec/batch
Epoch 2/20  Iteration 285/3560 Training loss: 2.7864 0.6580 sec/batch
Epoch 2/20  Iteration 286/3560 Training loss: 2.7847 0.6655 sec/batch
Epoch 2/20  Iteration 287/3560 Training loss: 2.7830 0.8069 sec/batch
Epoch 2/20  Iteration 288/3560 Training loss: 2.7809 0.6807 sec/batch
Epoch 2/20  Iteration 289/3560 Training loss: 2.7792 0.6961 sec/batch
Epoch 2/20  Iteration 290/3560 Training loss: 2.7776 0.6700 sec/batch
Epoch 2/20  Iteration 291/3560 Training loss: 2.7758 0.6363 sec/batch
Epoch 2/20  Iteration 292/3560 Training loss: 2.7740 0.6609 sec/batch
Epoch 2/20  Iteration 293/3560 Training loss: 2.7722 0.6835 sec/batch
Epoch 2/20  Iteration 294/3560 Training loss: 2.7703 0.6898 sec/batch
Epoch 2/20  Iteration 295/3560 Training loss: 2.7685 0.6921 sec/batch
Epoch 2/20  Iteration 296/3560 Training loss: 2.7669 0.6884 sec/batch
Epoch 2/20  Iteration 297/3560 Training loss: 2.7656 0.7096 sec/batch
Epoch 2/20  Iteration 298/3560 Training loss: 2.7640 0.6616 sec/batch
Epoch 2/20  Iteration 299/3560 Training loss: 2.7626 0.6384 sec/batch
Epoch 2/20  Iteration 300/3560 Training loss: 2.7611 0.6335 sec/batch
Epoch 2/20  Iteration 301/3560 Training loss: 2.7596 0.6405 sec/batch
Epoch 2/20  Iteration 302/3560 Training loss: 2.7582 0.6480 sec/batch
Epoch 2/20  Iteration 303/3560 Training loss: 2.7566 0.6487 sec/batch
Epoch 2/20  Iteration 304/3560 Training loss: 2.7549 0.6646 sec/batch
Epoch 2/20  Iteration 305/3560 Training loss: 2.7535 0.6923 sec/batch
Epoch 2/20  Iteration 306/3560 Training loss: 2.7522 0.6710 sec/batch
Epoch 2/20  Iteration 307/3560 Training loss: 2.7507 0.7240 sec/batch
Epoch 2/20  Iteration 308/3560 Training loss: 2.7492 0.6791 sec/batch
Epoch 2/20  Iteration 309/3560 Training loss: 2.7478 0.6580 sec/batch
Epoch 2/20  Iteration 310/3560 Training loss: 2.7463 0.6688 sec/batch
Epoch 2/20  Iteration 311/3560 Training loss: 2.7450 0.6726 sec/batch
Epoch 2/20  Iteration 312/3560 Training loss: 2.7437 0.6291 sec/batch
Epoch 2/20  Iteration 313/3560 Training loss: 2.7422 0.8158 sec/batch
Epoch 2/20  Iteration 314/3560 Training loss: 2.7407 0.7292 sec/batch
Epoch 2/20  Iteration 315/3560 Training loss: 2.7392 0.7178 sec/batch
Epoch 2/20  Iteration 316/3560 Training loss: 2.7378 0.7463 sec/batch
Epoch 2/20  Iteration 317/3560 Training loss: 2.7365 0.6588 sec/batch
Epoch 2/20  Iteration 318/3560 Training loss: 2.7352 0.6435 sec/batch
Epoch 2/20  Iteration 319/3560 Training loss: 2.7340 0.6780 sec/batch
Epoch 2/20  Iteration 320/3560 Training loss: 2.7326 0.6521 sec/batch
Epoch 2/20  Iteration 321/3560 Training loss: 2.7313 0.6523 sec/batch
Epoch 2/20  Iteration 322/3560 Training loss: 2.7299 0.7119 sec/batch
Epoch 2/20  Iteration 323/3560 Training loss: 2.7287 0.6490 sec/batch
Epoch 2/20  Iteration 324/3560 Training loss: 2.7275 0.6431 sec/batch
Epoch 2/20  Iteration 325/3560 Training loss: 2.7263 0.6729 sec/batch
Epoch 2/20  Iteration 326/3560 Training loss: 2.7252 0.6514 sec/batch
Epoch 2/20  Iteration 327/3560 Training loss: 2.7239 0.6668 sec/batch
Epoch 2/20  Iteration 328/3560 Training loss: 2.7226 0.6407 sec/batch
Epoch 2/20  Iteration 329/3560 Training loss: 2.7216 0.6506 sec/batch
Epoch 2/20  Iteration 330/3560 Training loss: 2.7206 0.6409 sec/batch
Epoch 2/20  Iteration 331/3560 Training loss: 2.7194 0.6472 sec/batch
Epoch 2/20  Iteration 332/3560 Training loss: 2.7182 0.6432 sec/batch
Epoch 2/20  Iteration 333/3560 Training loss: 2.7169 0.6370 sec/batch
Epoch 2/20  Iteration 334/3560 Training loss: 2.7157 0.6558 sec/batch
Epoch 2/20  Iteration 335/3560 Training loss: 2.7144 0.6570 sec/batch
Epoch 2/20  Iteration 336/3560 Training loss: 2.7132 0.6393 sec/batch
Epoch 2/20  Iteration 337/3560 Training loss: 2.7118 0.6297 sec/batch
Epoch 2/20  Iteration 338/3560 Training loss: 2.7107 0.7506 sec/batch
Epoch 2/20  Iteration 339/3560 Training loss: 2.7096 0.6323 sec/batch
Epoch 2/20  Iteration 340/3560 Training loss: 2.7082 0.6402 sec/batch
Epoch 2/20  Iteration 341/3560 Training loss: 2.7069 0.6314 sec/batch
Epoch 2/20  Iteration 342/3560 Training loss: 2.7058 0.6462 sec/batch
Epoch 2/20  Iteration 343/3560 Training loss: 2.7046 0.6393 sec/batch
Epoch 2/20  Iteration 344/3560 Training loss: 2.7035 0.6521 sec/batch
Epoch 2/20  Iteration 345/3560 Training loss: 2.7024 0.6851 sec/batch
Epoch 2/20  Iteration 346/3560 Training loss: 2.7012 0.6626 sec/batch
Epoch 2/20  Iteration 347/3560 Training loss: 2.7002 0.6404 sec/batch
Epoch 2/20  Iteration 348/3560 Training loss: 2.6990 0.6365 sec/batch
Epoch 2/20  Iteration 349/3560 Training loss: 2.6979 0.6400 sec/batch
Epoch 2/20  Iteration 350/3560 Training loss: 2.6969 0.6640 sec/batch
Epoch 2/20  Iteration 351/3560 Training loss: 2.6961 0.6360 sec/batch
Epoch 2/20  Iteration 352/3560 Training loss: 2.6952 0.6390 sec/batch
Epoch 2/20  Iteration 353/3560 Training loss: 2.6943 0.6369 sec/batch
Epoch 2/20  Iteration 354/3560 Training loss: 2.6932 0.6396 sec/batch
Epoch 2/20  Iteration 355/3560 Training loss: 2.6921 0.6554 sec/batch
Epoch 2/20  Iteration 356/3560 Training loss: 2.6909 0.6292 sec/batch
Epoch 3/20  Iteration 357/3560 Training loss: 2.5676 0.6111 sec/batch
...
Epoch 3/20  Iteration 534/3560 Training loss: 2.4137 0.6310 sec/batch
Epoch 4/20  Iteration 535/3560 Training loss: 2.4181 0.6410 sec/batch
...
Epoch 4/20  Iteration 712/3560 Training loss: 2.2848 0.6478 sec/batch
Epoch 5/20  Iteration 713/3560 Training loss: 2.3134 0.6403 sec/batch
...
Epoch 5/20  Iteration 890/3560 Training loss: 2.1955 0.7560 sec/batch
Epoch 6/20  Iteration 891/3560 Training loss: 2.2330 0.7007 sec/batch
...
Epoch 6/20  Iteration 1068/3560 Training loss: 2.1301 0.6286 sec/batch
Epoch 7/20  Iteration 1069/3560 Training loss: 2.1819 0.6108 sec/batch
...
Epoch 7/20  Iteration 1217/3560 Training loss: 2.0774 0.6347 sec/batch
Epoch 7/20  Iteration 1218/3560 Training loss: 2.0772 0.6332 sec/batch
Epoch 7/20  Iteration 1219/3560 Training loss: 2.0771 0.6102 sec/batch
Epoch 7/20  Iteration 1220/3560 Training loss: 2.0772 0.6319 sec/batch
Epoch 7/20  Iteration 1221/3560 Training loss: 2.0771 0.6359 sec/batch
Epoch 7/20  Iteration 1222/3560 Training loss: 2.0771 0.6379 sec/batch
Epoch 7/20  Iteration 1223/3560 Training loss: 2.0770 0.6322 sec/batch
Epoch 7/20  Iteration 1224/3560 Training loss: 2.0768 0.6327 sec/batch
Epoch 7/20  Iteration 1225/3560 Training loss: 2.0767 0.6129 sec/batch
Epoch 7/20  Iteration 1226/3560 Training loss: 2.0766 0.6451 sec/batch
Epoch 7/20  Iteration 1227/3560 Training loss: 2.0764 0.6363 sec/batch
Epoch 7/20  Iteration 1228/3560 Training loss: 2.0765 0.6348 sec/batch
Epoch 7/20  Iteration 1229/3560 Training loss: 2.0764 0.6340 sec/batch
Epoch 7/20  Iteration 1230/3560 Training loss: 2.0762 0.6337 sec/batch
Epoch 7/20  Iteration 1231/3560 Training loss: 2.0762 0.6369 sec/batch
Epoch 7/20  Iteration 1232/3560 Training loss: 2.0761 0.6351 sec/batch
Epoch 7/20  Iteration 1233/3560 Training loss: 2.0761 0.6371 sec/batch
Epoch 7/20  Iteration 1234/3560 Training loss: 2.0759 0.6541 sec/batch
Epoch 7/20  Iteration 1235/3560 Training loss: 2.0759 0.6360 sec/batch
Epoch 7/20  Iteration 1236/3560 Training loss: 2.0760 0.6396 sec/batch
Epoch 7/20  Iteration 1237/3560 Training loss: 2.0759 0.6362 sec/batch
Epoch 7/20  Iteration 1238/3560 Training loss: 2.0757 0.6329 sec/batch
Epoch 7/20  Iteration 1239/3560 Training loss: 2.0755 0.6283 sec/batch
Epoch 7/20  Iteration 1240/3560 Training loss: 2.0753 0.6391 sec/batch
Epoch 7/20  Iteration 1241/3560 Training loss: 2.0753 0.6351 sec/batch
Epoch 7/20  Iteration 1242/3560 Training loss: 2.0753 0.6429 sec/batch
Epoch 7/20  Iteration 1243/3560 Training loss: 2.0753 0.6332 sec/batch
Epoch 7/20  Iteration 1244/3560 Training loss: 2.0752 0.6327 sec/batch
Epoch 7/20  Iteration 1245/3560 Training loss: 2.0750 0.6391 sec/batch
Epoch 7/20  Iteration 1246/3560 Training loss: 2.0749 0.6247 sec/batch
Epoch 8/20  Iteration 1247/3560 Training loss: 2.1244 0.6357 sec/batch
Epoch 8/20  Iteration 1248/3560 Training loss: 2.0770 0.6312 sec/batch
Epoch 8/20  Iteration 1249/3560 Training loss: 2.0638 0.6362 sec/batch
Epoch 8/20  Iteration 1250/3560 Training loss: 2.0575 0.6394 sec/batch
Epoch 8/20  Iteration 1251/3560 Training loss: 2.0560 0.6328 sec/batch
Epoch 8/20  Iteration 1252/3560 Training loss: 2.0501 0.6272 sec/batch
Epoch 8/20  Iteration 1253/3560 Training loss: 2.0501 0.6421 sec/batch
Epoch 8/20  Iteration 1254/3560 Training loss: 2.0514 0.6319 sec/batch
Epoch 8/20  Iteration 1255/3560 Training loss: 2.0541 0.6299 sec/batch
Epoch 8/20  Iteration 1256/3560 Training loss: 2.0538 0.6364 sec/batch
Epoch 8/20  Iteration 1257/3560 Training loss: 2.0501 0.6379 sec/batch
Epoch 8/20  Iteration 1258/3560 Training loss: 2.0490 0.6375 sec/batch
Epoch 8/20  Iteration 1259/3560 Training loss: 2.0495 0.6376 sec/batch
Epoch 8/20  Iteration 1260/3560 Training loss: 2.0517 0.6348 sec/batch
Epoch 8/20  Iteration 1261/3560 Training loss: 2.0510 0.6382 sec/batch
Epoch 8/20  Iteration 1262/3560 Training loss: 2.0501 0.6344 sec/batch
Epoch 8/20  Iteration 1263/3560 Training loss: 2.0496 0.6306 sec/batch
Epoch 8/20  Iteration 1264/3560 Training loss: 2.0522 0.6335 sec/batch
Epoch 8/20  Iteration 1265/3560 Training loss: 2.0519 0.6291 sec/batch
Epoch 8/20  Iteration 1266/3560 Training loss: 2.0513 0.6401 sec/batch
Epoch 8/20  Iteration 1267/3560 Training loss: 2.0501 0.6322 sec/batch
Epoch 8/20  Iteration 1268/3560 Training loss: 2.0512 0.6375 sec/batch
Epoch 8/20  Iteration 1269/3560 Training loss: 2.0507 0.6338 sec/batch
Epoch 8/20  Iteration 1270/3560 Training loss: 2.0502 0.6356 sec/batch
Epoch 8/20  Iteration 1271/3560 Training loss: 2.0494 0.6201 sec/batch
Epoch 8/20  Iteration 1272/3560 Training loss: 2.0492 0.6333 sec/batch
Epoch 8/20  Iteration 1273/3560 Training loss: 2.0489 0.6309 sec/batch
Epoch 8/20  Iteration 1274/3560 Training loss: 2.0494 0.6331 sec/batch
Epoch 8/20  Iteration 1275/3560 Training loss: 2.0504 0.6335 sec/batch
Epoch 8/20  Iteration 1276/3560 Training loss: 2.0509 0.6354 sec/batch
Epoch 8/20  Iteration 1277/3560 Training loss: 2.0509 0.6306 sec/batch
Epoch 8/20  Iteration 1278/3560 Training loss: 2.0505 0.6291 sec/batch
Epoch 8/20  Iteration 1279/3560 Training loss: 2.0503 0.6340 sec/batch
Epoch 8/20  Iteration 1280/3560 Training loss: 2.0512 0.6358 sec/batch
Epoch 8/20  Iteration 1281/3560 Training loss: 2.0510 0.6444 sec/batch
Epoch 8/20  Iteration 1282/3560 Training loss: 2.0507 0.6375 sec/batch
Epoch 8/20  Iteration 1283/3560 Training loss: 2.0502 0.6403 sec/batch
Epoch 8/20  Iteration 1284/3560 Training loss: 2.0493 0.6368 sec/batch
Epoch 8/20  Iteration 1285/3560 Training loss: 2.0482 0.6335 sec/batch
Epoch 8/20  Iteration 1286/3560 Training loss: 2.0470 0.6373 sec/batch
Epoch 8/20  Iteration 1287/3560 Training loss: 2.0465 0.6375 sec/batch
Epoch 8/20  Iteration 1288/3560 Training loss: 2.0461 0.6218 sec/batch
Epoch 8/20  Iteration 1289/3560 Training loss: 2.0457 0.6268 sec/batch
Epoch 8/20  Iteration 1290/3560 Training loss: 2.0449 0.6258 sec/batch
Epoch 8/20  Iteration 1291/3560 Training loss: 2.0453 0.6421 sec/batch
Epoch 8/20  Iteration 1292/3560 Training loss: 2.0441 0.6287 sec/batch
Epoch 8/20  Iteration 1293/3560 Training loss: 2.0443 0.6346 sec/batch
Epoch 8/20  Iteration 1294/3560 Training loss: 2.0438 0.6385 sec/batch
Epoch 8/20  Iteration 1295/3560 Training loss: 2.0437 0.6289 sec/batch
Epoch 8/20  Iteration 1296/3560 Training loss: 2.0443 0.6328 sec/batch
Epoch 8/20  Iteration 1297/3560 Training loss: 2.0437 0.6367 sec/batch
Epoch 8/20  Iteration 1298/3560 Training loss: 2.0442 0.6378 sec/batch
Epoch 8/20  Iteration 1299/3560 Training loss: 2.0440 0.6394 sec/batch
Epoch 8/20  Iteration 1300/3560 Training loss: 2.0437 0.6302 sec/batch
Epoch 8/20  Iteration 1301/3560 Training loss: 2.0433 0.6163 sec/batch
Epoch 8/20  Iteration 1302/3560 Training loss: 2.0433 0.6232 sec/batch
Epoch 8/20  Iteration 1303/3560 Training loss: 2.0433 0.6336 sec/batch
Epoch 8/20  Iteration 1304/3560 Training loss: 2.0430 0.6305 sec/batch
Epoch 8/20  Iteration 1305/3560 Training loss: 2.0427 0.6385 sec/batch
Epoch 8/20  Iteration 1306/3560 Training loss: 2.0429 0.6292 sec/batch
Epoch 8/20  Iteration 1307/3560 Training loss: 2.0426 0.6381 sec/batch
Epoch 8/20  Iteration 1308/3560 Training loss: 2.0431 0.6280 sec/batch
Epoch 8/20  Iteration 1309/3560 Training loss: 2.0434 0.6249 sec/batch
Epoch 8/20  Iteration 1310/3560 Training loss: 2.0434 0.6394 sec/batch
Epoch 8/20  Iteration 1311/3560 Training loss: 2.0431 0.6397 sec/batch
Epoch 8/20  Iteration 1312/3560 Training loss: 2.0431 0.6340 sec/batch
Epoch 8/20  Iteration 1313/3560 Training loss: 2.0432 0.6367 sec/batch
Epoch 8/20  Iteration 1314/3560 Training loss: 2.0425 0.6387 sec/batch
Epoch 8/20  Iteration 1315/3560 Training loss: 2.0422 0.6413 sec/batch
Epoch 8/20  Iteration 1316/3560 Training loss: 2.0422 0.6441 sec/batch
Epoch 8/20  Iteration 1317/3560 Training loss: 2.0426 0.6310 sec/batch
Epoch 8/20  Iteration 1318/3560 Training loss: 2.0427 0.6383 sec/batch
Epoch 8/20  Iteration 1319/3560 Training loss: 2.0430 0.6391 sec/batch
Epoch 8/20  Iteration 1320/3560 Training loss: 2.0427 0.6493 sec/batch
Epoch 8/20  Iteration 1321/3560 Training loss: 2.0427 0.6402 sec/batch
Epoch 8/20  Iteration 1322/3560 Training loss: 2.0430 0.6326 sec/batch
Epoch 8/20  Iteration 1323/3560 Training loss: 2.0428 0.6372 sec/batch
Epoch 8/20  Iteration 1324/3560 Training loss: 2.0429 0.6409 sec/batch
Epoch 8/20  Iteration 1325/3560 Training loss: 2.0423 0.6362 sec/batch
Epoch 8/20  Iteration 1326/3560 Training loss: 2.0420 0.6354 sec/batch
Epoch 8/20  Iteration 1327/3560 Training loss: 2.0415 0.6357 sec/batch
Epoch 8/20  Iteration 1328/3560 Training loss: 2.0417 0.6430 sec/batch
Epoch 8/20  Iteration 1329/3560 Training loss: 2.0411 0.6235 sec/batch
Epoch 8/20  Iteration 1330/3560 Training loss: 2.0410 0.6336 sec/batch
Epoch 8/20  Iteration 1331/3560 Training loss: 2.0403 0.6422 sec/batch
Epoch 8/20  Iteration 1332/3560 Training loss: 2.0400 0.6423 sec/batch
Epoch 8/20  Iteration 1333/3560 Training loss: 2.0399 0.6325 sec/batch
Epoch 8/20  Iteration 1334/3560 Training loss: 2.0397 0.6324 sec/batch
Epoch 8/20  Iteration 1335/3560 Training loss: 2.0393 0.6343 sec/batch
Epoch 8/20  Iteration 1336/3560 Training loss: 2.0393 0.6326 sec/batch
Epoch 8/20  Iteration 1337/3560 Training loss: 2.0391 0.6501 sec/batch
Epoch 8/20  Iteration 1338/3560 Training loss: 2.0390 0.6369 sec/batch
Epoch 8/20  Iteration 1339/3560 Training loss: 2.0386 0.6375 sec/batch
Epoch 8/20  Iteration 1340/3560 Training loss: 2.0382 0.6049 sec/batch
Epoch 8/20  Iteration 1341/3560 Training loss: 2.0377 0.6533 sec/batch
Epoch 8/20  Iteration 1342/3560 Training loss: 2.0376 0.6441 sec/batch
Epoch 8/20  Iteration 1343/3560 Training loss: 2.0375 0.6375 sec/batch
Epoch 8/20  Iteration 1344/3560 Training loss: 2.0373 0.6422 sec/batch
Epoch 8/20  Iteration 1345/3560 Training loss: 2.0369 0.6371 sec/batch
Epoch 8/20  Iteration 1346/3560 Training loss: 2.0365 0.6291 sec/batch
Epoch 8/20  Iteration 1347/3560 Training loss: 2.0364 0.6369 sec/batch
Epoch 8/20  Iteration 1348/3560 Training loss: 2.0364 0.6352 sec/batch
Epoch 8/20  Iteration 1349/3560 Training loss: 2.0360 0.7056 sec/batch
Epoch 8/20  Iteration 1350/3560 Training loss: 2.0357 0.6688 sec/batch
Epoch 8/20  Iteration 1351/3560 Training loss: 2.0354 0.6651 sec/batch
Epoch 8/20  Iteration 1352/3560 Training loss: 2.0353 0.6416 sec/batch
Epoch 8/20  Iteration 1353/3560 Training loss: 2.0350 0.6506 sec/batch
Epoch 8/20  Iteration 1354/3560 Training loss: 2.0348 0.6475 sec/batch
Epoch 8/20  Iteration 1355/3560 Training loss: 2.0348 0.6287 sec/batch
Epoch 8/20  Iteration 1356/3560 Training loss: 2.0347 0.6328 sec/batch
Epoch 8/20  Iteration 1357/3560 Training loss: 2.0346 0.6384 sec/batch
Epoch 8/20  Iteration 1358/3560 Training loss: 2.0344 0.6331 sec/batch
Epoch 8/20  Iteration 1359/3560 Training loss: 2.0344 0.6392 sec/batch
Epoch 8/20  Iteration 1360/3560 Training loss: 2.0342 0.6357 sec/batch
Epoch 8/20  Iteration 1361/3560 Training loss: 2.0339 0.6490 sec/batch
Epoch 8/20  Iteration 1362/3560 Training loss: 2.0335 0.6396 sec/batch
Epoch 8/20  Iteration 1363/3560 Training loss: 2.0334 0.6477 sec/batch
Epoch 8/20  Iteration 1364/3560 Training loss: 2.0333 0.6363 sec/batch
Epoch 8/20  Iteration 1365/3560 Training loss: 2.0333 0.6296 sec/batch
Epoch 8/20  Iteration 1366/3560 Training loss: 2.0332 0.6362 sec/batch
Epoch 8/20  Iteration 1367/3560 Training loss: 2.0332 0.6378 sec/batch
Epoch 8/20  Iteration 1368/3560 Training loss: 2.0330 0.6332 sec/batch
Epoch 8/20  Iteration 1369/3560 Training loss: 2.0329 0.6292 sec/batch
Epoch 8/20  Iteration 1370/3560 Training loss: 2.0330 0.6440 sec/batch
Epoch 8/20  Iteration 1371/3560 Training loss: 2.0329 0.6414 sec/batch
Epoch 8/20  Iteration 1372/3560 Training loss: 2.0325 0.6368 sec/batch
Epoch 8/20  Iteration 1373/3560 Training loss: 2.0326 0.6311 sec/batch
Epoch 8/20  Iteration 1374/3560 Training loss: 2.0326 0.6567 sec/batch
Epoch 8/20  Iteration 1375/3560 Training loss: 2.0325 0.6408 sec/batch
Epoch 8/20  Iteration 1376/3560 Training loss: 2.0324 0.6312 sec/batch
Epoch 8/20  Iteration 1377/3560 Training loss: 2.0323 0.6336 sec/batch
Epoch 8/20  Iteration 1378/3560 Training loss: 2.0320 0.6321 sec/batch
Epoch 8/20  Iteration 1379/3560 Training loss: 2.0320 0.6325 sec/batch
Epoch 8/20  Iteration 1380/3560 Training loss: 2.0320 0.6326 sec/batch
Epoch 8/20  Iteration 1381/3560 Training loss: 2.0319 0.6371 sec/batch
Epoch 8/20  Iteration 1382/3560 Training loss: 2.0318 0.6341 sec/batch
Epoch 8/20  Iteration 1383/3560 Training loss: 2.0317 0.6185 sec/batch
Epoch 8/20  Iteration 1384/3560 Training loss: 2.0317 0.6337 sec/batch
Epoch 8/20  Iteration 1385/3560 Training loss: 2.0318 0.6255 sec/batch
Epoch 8/20  Iteration 1386/3560 Training loss: 2.0316 0.6401 sec/batch
Epoch 8/20  Iteration 1387/3560 Training loss: 2.0317 0.6366 sec/batch
Epoch 8/20  Iteration 1388/3560 Training loss: 2.0316 0.6333 sec/batch
Epoch 8/20  Iteration 1389/3560 Training loss: 2.0315 0.6402 sec/batch
Epoch 8/20  Iteration 1390/3560 Training loss: 2.0315 0.6363 sec/batch
Epoch 8/20  Iteration 1391/3560 Training loss: 2.0314 0.6342 sec/batch
Epoch 8/20  Iteration 1392/3560 Training loss: 2.0313 0.6317 sec/batch
Epoch 8/20  Iteration 1393/3560 Training loss: 2.0312 0.6359 sec/batch
Epoch 8/20  Iteration 1394/3560 Training loss: 2.0314 0.6373 sec/batch
Epoch 8/20  Iteration 1395/3560 Training loss: 2.0313 0.6410 sec/batch
Epoch 8/20  Iteration 1396/3560 Training loss: 2.0312 0.6285 sec/batch
Epoch 8/20  Iteration 1397/3560 Training loss: 2.0310 0.6397 sec/batch
Epoch 8/20  Iteration 1398/3560 Training loss: 2.0312 0.6368 sec/batch
Epoch 8/20  Iteration 1399/3560 Training loss: 2.0311 0.6441 sec/batch
Epoch 8/20  Iteration 1400/3560 Training loss: 2.0311 0.6292 sec/batch
Epoch 8/20  Iteration 1401/3560 Training loss: 2.0309 0.6303 sec/batch
Epoch 8/20  Iteration 1402/3560 Training loss: 2.0308 0.6382 sec/batch
Epoch 8/20  Iteration 1403/3560 Training loss: 2.0308 0.6210 sec/batch
Epoch 8/20  Iteration 1404/3560 Training loss: 2.0308 0.6414 sec/batch
Epoch 8/20  Iteration 1405/3560 Training loss: 2.0305 0.6327 sec/batch
Epoch 8/20  Iteration 1406/3560 Training loss: 2.0306 0.6435 sec/batch
Epoch 8/20  Iteration 1407/3560 Training loss: 2.0306 0.6356 sec/batch
Epoch 8/20  Iteration 1408/3560 Training loss: 2.0305 0.6394 sec/batch
Epoch 8/20  Iteration 1409/3560 Training loss: 2.0304 0.6378 sec/batch
Epoch 8/20  Iteration 1410/3560 Training loss: 2.0304 0.6366 sec/batch
Epoch 8/20  Iteration 1411/3560 Training loss: 2.0303 0.6318 sec/batch
Epoch 8/20  Iteration 1412/3560 Training loss: 2.0303 0.6400 sec/batch
Epoch 8/20  Iteration 1413/3560 Training loss: 2.0302 0.6249 sec/batch
Epoch 8/20  Iteration 1414/3560 Training loss: 2.0304 0.6242 sec/batch
Epoch 8/20  Iteration 1415/3560 Training loss: 2.0302 0.6301 sec/batch
Epoch 8/20  Iteration 1416/3560 Training loss: 2.0301 0.6345 sec/batch
Epoch 8/20  Iteration 1417/3560 Training loss: 2.0299 0.6408 sec/batch
Epoch 8/20  Iteration 1418/3560 Training loss: 2.0298 0.6312 sec/batch
Epoch 8/20  Iteration 1419/3560 Training loss: 2.0298 0.6339 sec/batch
Epoch 8/20  Iteration 1420/3560 Training loss: 2.0299 0.6388 sec/batch
Epoch 8/20  Iteration 1421/3560 Training loss: 2.0298 0.6495 sec/batch
Epoch 8/20  Iteration 1422/3560 Training loss: 2.0297 0.6349 sec/batch
Epoch 8/20  Iteration 1423/3560 Training loss: 2.0296 0.6371 sec/batch
Epoch 8/20  Iteration 1424/3560 Training loss: 2.0295 0.6360 sec/batch
Epoch 9/20  Iteration 1425/3560 Training loss: 2.0823 0.6374 sec/batch
Epoch 9/20  Iteration 1426/3560 Training loss: 2.0376 0.6290 sec/batch
Epoch 9/20  Iteration 1427/3560 Training loss: 2.0220 0.6395 sec/batch
Epoch 9/20  Iteration 1428/3560 Training loss: 2.0162 0.6308 sec/batch
Epoch 9/20  Iteration 1429/3560 Training loss: 2.0122 0.6350 sec/batch
Epoch 9/20  Iteration 1430/3560 Training loss: 2.0075 0.6412 sec/batch
Epoch 9/20  Iteration 1431/3560 Training loss: 2.0087 0.6367 sec/batch
Epoch 9/20  Iteration 1432/3560 Training loss: 2.0093 0.6298 sec/batch
Epoch 9/20  Iteration 1433/3560 Training loss: 2.0123 0.6382 sec/batch
Epoch 9/20  Iteration 1434/3560 Training loss: 2.0122 0.6444 sec/batch
Epoch 9/20  Iteration 1435/3560 Training loss: 2.0101 0.6324 sec/batch
Epoch 9/20  Iteration 1436/3560 Training loss: 2.0086 0.6341 sec/batch
Epoch 9/20  Iteration 1437/3560 Training loss: 2.0083 0.6396 sec/batch
Epoch 9/20  Iteration 1438/3560 Training loss: 2.0107 0.6239 sec/batch
Epoch 9/20  Iteration 1439/3560 Training loss: 2.0100 0.6393 sec/batch
Epoch 9/20  Iteration 1440/3560 Training loss: 2.0092 0.6373 sec/batch
Epoch 9/20  Iteration 1441/3560 Training loss: 2.0089 0.6335 sec/batch
Epoch 9/20  Iteration 1442/3560 Training loss: 2.0111 0.6387 sec/batch
Epoch 9/20  Iteration 1443/3560 Training loss: 2.0106 0.6296 sec/batch
Epoch 9/20  Iteration 1444/3560 Training loss: 2.0103 0.6360 sec/batch
Epoch 9/20  Iteration 1445/3560 Training loss: 2.0097 0.6356 sec/batch
Epoch 9/20  Iteration 1446/3560 Training loss: 2.0109 0.6337 sec/batch
Epoch 9/20  Iteration 1447/3560 Training loss: 2.0101 0.6389 sec/batch
Epoch 9/20  Iteration 1448/3560 Training loss: 2.0100 0.6314 sec/batch
Epoch 9/20  Iteration 1449/3560 Training loss: 2.0102 0.6337 sec/batch
Epoch 9/20  Iteration 1450/3560 Training loss: 2.0097 0.6374 sec/batch
Epoch 9/20  Iteration 1451/3560 Training loss: 2.0093 0.6350 sec/batch
Epoch 9/20  Iteration 1452/3560 Training loss: 2.0096 0.6258 sec/batch
Epoch 9/20  Iteration 1453/3560 Training loss: 2.0107 0.6366 sec/batch
Epoch 9/20  Iteration 1454/3560 Training loss: 2.0112 0.6406 sec/batch
Epoch 9/20  Iteration 1455/3560 Training loss: 2.0112 0.6454 sec/batch
Epoch 9/20  Iteration 1456/3560 Training loss: 2.0103 0.6449 sec/batch
Epoch 9/20  Iteration 1457/3560 Training loss: 2.0100 0.6370 sec/batch
Epoch 9/20  Iteration 1458/3560 Training loss: 2.0108 0.6288 sec/batch
Epoch 9/20  Iteration 1459/3560 Training loss: 2.0104 0.6381 sec/batch
Epoch 9/20  Iteration 1460/3560 Training loss: 2.0101 0.6319 sec/batch
Epoch 9/20  Iteration 1461/3560 Training loss: 2.0098 0.6337 sec/batch
Epoch 9/20  Iteration 1462/3560 Training loss: 2.0084 0.6347 sec/batch
Epoch 9/20  Iteration 1463/3560 Training loss: 2.0075 0.6305 sec/batch
Epoch 9/20  Iteration 1464/3560 Training loss: 2.0068 0.6378 sec/batch
Epoch 9/20  Iteration 1465/3560 Training loss: 2.0065 0.6229 sec/batch
Epoch 9/20  Iteration 1466/3560 Training loss: 2.0064 0.6342 sec/batch
Epoch 9/20  Iteration 1467/3560 Training loss: 2.0058 0.6350 sec/batch
Epoch 9/20  Iteration 1468/3560 Training loss: 2.0050 0.6483 sec/batch
Epoch 9/20  Iteration 1469/3560 Training loss: 2.0052 0.6388 sec/batch
Epoch 9/20  Iteration 1470/3560 Training loss: 2.0041 0.6417 sec/batch
Epoch 9/20  Iteration 1471/3560 Training loss: 2.0042 0.6379 sec/batch
Epoch 9/20  Iteration 1472/3560 Training loss: 2.0039 0.6442 sec/batch
Epoch 9/20  Iteration 1473/3560 Training loss: 2.0037 0.6282 sec/batch
Epoch 9/20  Iteration 1474/3560 Training loss: 2.0041 0.6375 sec/batch
Epoch 9/20  Iteration 1475/3560 Training loss: 2.0036 0.6605 sec/batch
Epoch 9/20  Iteration 1476/3560 Training loss: 2.0043 0.8424 sec/batch
Epoch 9/20  Iteration 1477/3560 Training loss: 2.0038 0.6445 sec/batch
Epoch 9/20  Iteration 1478/3560 Training loss: 2.0037 0.6447 sec/batch
Epoch 9/20  Iteration 1479/3560 Training loss: 2.0032 0.6305 sec/batch
Epoch 9/20  Iteration 1480/3560 Training loss: 2.0032 0.6276 sec/batch
Epoch 9/20  Iteration 1481/3560 Training loss: 2.0034 0.6161 sec/batch
Epoch 9/20  Iteration 1482/3560 Training loss: 2.0030 0.6437 sec/batch
Epoch 9/20  Iteration 1483/3560 Training loss: 2.0027 0.6316 sec/batch
Epoch 9/20  Iteration 1484/3560 Training loss: 2.0030 0.6367 sec/batch
Epoch 9/20  Iteration 1485/3560 Training loss: 2.0027 0.6356 sec/batch
Epoch 9/20  Iteration 1486/3560 Training loss: 2.0031 0.6376 sec/batch
Epoch 9/20  Iteration 1487/3560 Training loss: 2.0034 0.6314 sec/batch
Epoch 9/20  Iteration 1488/3560 Training loss: 2.0036 0.6302 sec/batch
Epoch 9/20  Iteration 1489/3560 Training loss: 2.0032 0.6386 sec/batch
Epoch 9/20  Iteration 1490/3560 Training loss: 2.0035 0.6348 sec/batch
Epoch 9/20  Iteration 1491/3560 Training loss: 2.0036 0.6239 sec/batch
Epoch 9/20  Iteration 1492/3560 Training loss: 2.0031 0.6326 sec/batch
Epoch 9/20  Iteration 1493/3560 Training loss: 2.0029 0.6296 sec/batch
Epoch 9/20  Iteration 1494/3560 Training loss: 2.0030 0.6350 sec/batch
Epoch 9/20  Iteration 1495/3560 Training loss: 2.0033 0.6248 sec/batch
Epoch 9/20  Iteration 1496/3560 Training loss: 2.0035 0.6382 sec/batch
Epoch 9/20  Iteration 1497/3560 Training loss: 2.0039 0.6378 sec/batch
Epoch 9/20  Iteration 1498/3560 Training loss: 2.0035 0.6362 sec/batch
Epoch 9/20  Iteration 1499/3560 Training loss: 2.0034 0.6350 sec/batch
Epoch 9/20  Iteration 1500/3560 Training loss: 2.0037 0.6473 sec/batch
Epoch 9/20  Iteration 1501/3560 Training loss: 2.0035 0.6354 sec/batch
Epoch 9/20  Iteration 1502/3560 Training loss: 2.0036 0.6354 sec/batch
Epoch 9/20  Iteration 1503/3560 Training loss: 2.0030 0.6347 sec/batch
Epoch 9/20  Iteration 1504/3560 Training loss: 2.0028 0.6338 sec/batch
Epoch 9/20  Iteration 1505/3560 Training loss: 2.0023 0.6406 sec/batch
Epoch 9/20  Iteration 1506/3560 Training loss: 2.0024 0.6318 sec/batch
Epoch 9/20  Iteration 1507/3560 Training loss: 2.0019 0.6248 sec/batch
Epoch 9/20  Iteration 1508/3560 Training loss: 2.0017 0.6303 sec/batch
Epoch 9/20  Iteration 1509/3560 Training loss: 2.0012 0.6445 sec/batch
Epoch 9/20  Iteration 1510/3560 Training loss: 2.0008 0.6370 sec/batch
Epoch 9/20  Iteration 1511/3560 Training loss: 2.0007 0.6294 sec/batch
Epoch 9/20  Iteration 1512/3560 Training loss: 2.0003 0.6350 sec/batch
Epoch 9/20  Iteration 1513/3560 Training loss: 1.9998 0.6373 sec/batch
Epoch 9/20  Iteration 1514/3560 Training loss: 1.9998 0.6367 sec/batch
Epoch 9/20  Iteration 1515/3560 Training loss: 1.9994 0.6549 sec/batch
Epoch 9/20  Iteration 1516/3560 Training loss: 1.9993 0.6321 sec/batch
Epoch 9/20  Iteration 1517/3560 Training loss: 1.9989 0.6335 sec/batch
Epoch 9/20  Iteration 1518/3560 Training loss: 1.9986 0.6304 sec/batch
Epoch 9/20  Iteration 1519/3560 Training loss: 1.9982 0.6364 sec/batch
Epoch 9/20  Iteration 1520/3560 Training loss: 1.9981 0.6282 sec/batch
Epoch 9/20  Iteration 1521/3560 Training loss: 1.9979 0.6345 sec/batch
Epoch 9/20  Iteration 1522/3560 Training loss: 1.9975 0.6394 sec/batch
Epoch 9/20  Iteration 1523/3560 Training loss: 1.9971 0.6399 sec/batch
Epoch 9/20  Iteration 1524/3560 Training loss: 1.9967 0.6307 sec/batch
Epoch 9/20  Iteration 1525/3560 Training loss: 1.9966 0.6379 sec/batch
Epoch 9/20  Iteration 1526/3560 Training loss: 1.9965 0.6329 sec/batch
Epoch 9/20  Iteration 1527/3560 Training loss: 1.9962 0.6342 sec/batch
Epoch 9/20  Iteration 1528/3560 Training loss: 1.9959 0.6507 sec/batch
Epoch 9/20  Iteration 1529/3560 Training loss: 1.9958 0.6406 sec/batch
Epoch 9/20  Iteration 1530/3560 Training loss: 1.9956 0.6461 sec/batch
Epoch 9/20  Iteration 1531/3560 Training loss: 1.9955 0.6388 sec/batch
Epoch 9/20  Iteration 1532/3560 Training loss: 1.9954 0.6246 sec/batch
Epoch 9/20  Iteration 1533/3560 Training loss: 1.9954 0.6333 sec/batch
Epoch 9/20  Iteration 1534/3560 Training loss: 1.9953 0.6415 sec/batch
Epoch 9/20  Iteration 1535/3560 Training loss: 1.9951 0.6350 sec/batch
Epoch 9/20  Iteration 1536/3560 Training loss: 1.9951 0.7082 sec/batch
Epoch 9/20  Iteration 1537/3560 Training loss: 1.9949 0.6641 sec/batch
Epoch 9/20  Iteration 1538/3560 Training loss: 1.9948 0.6628 sec/batch
Epoch 9/20  Iteration 1539/3560 Training loss: 1.9945 0.6430 sec/batch
Epoch 9/20  Iteration 1540/3560 Training loss: 1.9940 0.6595 sec/batch
Epoch 9/20  Iteration 1541/3560 Training loss: 1.9940 0.6346 sec/batch
Epoch 9/20  Iteration 1542/3560 Training loss: 1.9939 0.6369 sec/batch
Epoch 9/20  Iteration 1543/3560 Training loss: 1.9939 0.6354 sec/batch
Epoch 9/20  Iteration 1544/3560 Training loss: 1.9938 0.6368 sec/batch
Epoch 9/20  Iteration 1545/3560 Training loss: 1.9938 0.6367 sec/batch
Epoch 9/20  Iteration 1546/3560 Training loss: 1.9935 0.6329 sec/batch
Epoch 9/20  Iteration 1547/3560 Training loss: 1.9932 0.6436 sec/batch
Epoch 9/20  Iteration 1548/3560 Training loss: 1.9934 0.6444 sec/batch
Epoch 9/20  Iteration 1549/3560 Training loss: 1.9934 0.6310 sec/batch
Epoch 9/20  Iteration 1550/3560 Training loss: 1.9930 0.6280 sec/batch
Epoch 9/20  Iteration 1551/3560 Training loss: 1.9931 0.6309 sec/batch
Epoch 9/20  Iteration 1552/3560 Training loss: 1.9930 0.6211 sec/batch
Epoch 9/20  Iteration 1553/3560 Training loss: 1.9929 0.6399 sec/batch
Epoch 9/20  Iteration 1554/3560 Training loss: 1.9928 0.6372 sec/batch
Epoch 9/20  Iteration 1555/3560 Training loss: 1.9925 0.6377 sec/batch
Epoch 9/20  Iteration 1556/3560 Training loss: 1.9922 0.6294 sec/batch
Epoch 9/20  Iteration 1557/3560 Training loss: 1.9922 0.6351 sec/batch
Epoch 9/20  Iteration 1558/3560 Training loss: 1.9922 0.6362 sec/batch
Epoch 9/20  Iteration 1559/3560 Training loss: 1.9921 0.6297 sec/batch
Epoch 9/20  Iteration 1560/3560 Training loss: 1.9921 0.6248 sec/batch
Epoch 9/20  Iteration 1561/3560 Training loss: 1.9921 0.6347 sec/batch
Epoch 9/20  Iteration 1562/3560 Training loss: 1.9921 0.6428 sec/batch
Epoch 9/20  Iteration 1563/3560 Training loss: 1.9921 0.6380 sec/batch
Epoch 9/20  Iteration 1564/3560 Training loss: 1.9919 0.6399 sec/batch
Epoch 9/20  Iteration 1565/3560 Training loss: 1.9920 0.6408 sec/batch
Epoch 9/20  Iteration 1566/3560 Training loss: 1.9919 0.6385 sec/batch
Epoch 9/20  Iteration 1567/3560 Training loss: 1.9918 0.6347 sec/batch
Epoch 9/20  Iteration 1568/3560 Training loss: 1.9918 0.6400 sec/batch
Epoch 9/20  Iteration 1569/3560 Training loss: 1.9916 0.6437 sec/batch
Epoch 9/20  Iteration 1570/3560 Training loss: 1.9917 0.6425 sec/batch
Epoch 9/20  Iteration 1571/3560 Training loss: 1.9917 0.6361 sec/batch
Epoch 9/20  Iteration 1572/3560 Training loss: 1.9918 0.6301 sec/batch
Epoch 9/20  Iteration 1573/3560 Training loss: 1.9916 0.6318 sec/batch
Epoch 9/20  Iteration 1574/3560 Training loss: 1.9914 0.6349 sec/batch
Epoch 9/20  Iteration 1575/3560 Training loss: 1.9912 0.6102 sec/batch
Epoch 9/20  Iteration 1576/3560 Training loss: 1.9913 0.6373 sec/batch
Epoch 9/20  Iteration 1577/3560 Training loss: 1.9913 0.6337 sec/batch
Epoch 9/20  Iteration 1578/3560 Training loss: 1.9913 0.6458 sec/batch
Epoch 9/20  Iteration 1579/3560 Training loss: 1.9912 0.6379 sec/batch
Epoch 9/20  Iteration 1580/3560 Training loss: 1.9910 0.6302 sec/batch
Epoch 9/20  Iteration 1581/3560 Training loss: 1.9910 0.6417 sec/batch
Epoch 9/20  Iteration 1582/3560 Training loss: 1.9909 0.6350 sec/batch
Epoch 9/20  Iteration 1583/3560 Training loss: 1.9907 0.6333 sec/batch
Epoch 9/20  Iteration 1584/3560 Training loss: 1.9908 0.6343 sec/batch
Epoch 9/20  Iteration 1585/3560 Training loss: 1.9908 0.6494 sec/batch
Epoch 9/20  Iteration 1586/3560 Training loss: 1.9907 0.6393 sec/batch
Epoch 9/20  Iteration 1587/3560 Training loss: 1.9906 0.6361 sec/batch
Epoch 9/20  Iteration 1588/3560 Training loss: 1.9906 0.6429 sec/batch
Epoch 9/20  Iteration 1589/3560 Training loss: 1.9905 0.6344 sec/batch
Epoch 9/20  Iteration 1590/3560 Training loss: 1.9904 0.6381 sec/batch
Epoch 9/20  Iteration 1591/3560 Training loss: 1.9903 0.6207 sec/batch
Epoch 9/20  Iteration 1592/3560 Training loss: 1.9905 0.6418 sec/batch
Epoch 9/20  Iteration 1593/3560 Training loss: 1.9904 0.6415 sec/batch
Epoch 9/20  Iteration 1594/3560 Training loss: 1.9903 0.6373 sec/batch
Epoch 9/20  Iteration 1595/3560 Training loss: 1.9901 0.6388 sec/batch
Epoch 9/20  Iteration 1596/3560 Training loss: 1.9899 0.6400 sec/batch
Epoch 9/20  Iteration 1597/3560 Training loss: 1.9899 0.6148 sec/batch
Epoch 9/20  Iteration 1598/3560 Training loss: 1.9899 0.6157 sec/batch
Epoch 9/20  Iteration 1599/3560 Training loss: 1.9899 0.6351 sec/batch
Epoch 9/20  Iteration 1600/3560 Training loss: 1.9898 0.6355 sec/batch
Epoch 9/20  Iteration 1601/3560 Training loss: 1.9897 0.6409 sec/batch
Epoch 9/20  Iteration 1602/3560 Training loss: 1.9896 0.6353 sec/batch
Epoch 10/20  Iteration 1603/3560 Training loss: 2.0464 0.6393 sec/batch
Epoch 10/20  Iteration 1604/3560 Training loss: 2.0045 0.6297 sec/batch
Epoch 10/20  Iteration 1605/3560 Training loss: 1.9909 0.6312 sec/batch
Epoch 10/20  Iteration 1606/3560 Training loss: 1.9871 0.6334 sec/batch
Epoch 10/20  Iteration 1607/3560 Training loss: 1.9847 0.6388 sec/batch
Epoch 10/20  Iteration 1608/3560 Training loss: 1.9760 0.6519 sec/batch
Epoch 10/20  Iteration 1609/3560 Training loss: 1.9763 0.6329 sec/batch
Epoch 10/20  Iteration 1610/3560 Training loss: 1.9747 0.6362 sec/batch
Epoch 10/20  Iteration 1611/3560 Training loss: 1.9772 0.6072 sec/batch
Epoch 10/20  Iteration 1612/3560 Training loss: 1.9769 0.6348 sec/batch
Epoch 10/20  Iteration 1613/3560 Training loss: 1.9741 0.6280 sec/batch
Epoch 10/20  Iteration 1614/3560 Training loss: 1.9733 0.6347 sec/batch
Epoch 10/20  Iteration 1615/3560 Training loss: 1.9735 0.6363 sec/batch
Epoch 10/20  Iteration 1616/3560 Training loss: 1.9760 0.6373 sec/batch
Epoch 10/20  Iteration 1617/3560 Training loss: 1.9752 0.6234 sec/batch
Epoch 10/20  Iteration 1618/3560 Training loss: 1.9742 0.6383 sec/batch
Epoch 10/20  Iteration 1619/3560 Training loss: 1.9739 0.6381 sec/batch
Epoch 10/20  Iteration 1620/3560 Training loss: 1.9763 0.6374 sec/batch
Epoch 10/20  Iteration 1621/3560 Training loss: 1.9763 0.6402 sec/batch
Epoch 10/20  Iteration 1622/3560 Training loss: 1.9763 0.6328 sec/batch
Epoch 10/20  Iteration 1623/3560 Training loss: 1.9760 0.6437 sec/batch
Epoch 10/20  Iteration 1624/3560 Training loss: 1.9767 0.6157 sec/batch
Epoch 10/20  Iteration 1625/3560 Training loss: 1.9758 0.6380 sec/batch
Epoch 10/20  Iteration 1626/3560 Training loss: 1.9755 0.6294 sec/batch
Epoch 10/20  Iteration 1627/3560 Training loss: 1.9748 0.6374 sec/batch
Epoch 10/20  Iteration 1628/3560 Training loss: 1.9743 0.6300 sec/batch
Epoch 10/20  Iteration 1629/3560 Training loss: 1.9735 0.6373 sec/batch
Epoch 10/20  Iteration 1630/3560 Training loss: 1.9738 0.6416 sec/batch
Epoch 10/20  Iteration 1631/3560 Training loss: 1.9749 0.6314 sec/batch
Epoch 10/20  Iteration 1632/3560 Training loss: 1.9752 0.6431 sec/batch
Epoch 10/20  Iteration 1633/3560 Training loss: 1.9754 0.6381 sec/batch
Epoch 10/20  Iteration 1634/3560 Training loss: 1.9749 0.6396 sec/batch
Epoch 10/20  Iteration 1635/3560 Training loss: 1.9748 0.6348 sec/batch
Epoch 10/20  Iteration 1636/3560 Training loss: 1.9757 0.6377 sec/batch
Epoch 10/20  Iteration 1637/3560 Training loss: 1.9752 0.6358 sec/batch
Epoch 10/20  Iteration 1638/3560 Training loss: 1.9750 0.6434 sec/batch
Epoch 10/20  Iteration 1639/3560 Training loss: 1.9744 0.6347 sec/batch
Epoch 10/20  Iteration 1640/3560 Training loss: 1.9733 0.6376 sec/batch
Epoch 10/20  Iteration 1641/3560 Training loss: 1.9723 0.6346 sec/batch
Epoch 10/20  Iteration 1642/3560 Training loss: 1.9714 0.6387 sec/batch
Epoch 10/20  Iteration 1643/3560 Training loss: 1.9710 0.6378 sec/batch
Epoch 10/20  Iteration 1644/3560 Training loss: 1.9710 0.6762 sec/batch
Epoch 10/20  Iteration 1645/3560 Training loss: 1.9703 0.7570 sec/batch
Epoch 10/20  Iteration 1646/3560 Training loss: 1.9696 0.7491 sec/batch
Epoch 10/20  Iteration 1647/3560 Training loss: 1.9697 0.7691 sec/batch
Epoch 10/20  Iteration 1648/3560 Training loss: 1.9685 0.7728 sec/batch
Epoch 10/20  Iteration 1649/3560 Training loss: 1.9686 0.7621 sec/batch
Epoch 10/20  Iteration 1650/3560 Training loss: 1.9683 0.7600 sec/batch
Epoch 10/20  Iteration 1651/3560 Training loss: 1.9680 0.7637 sec/batch
Epoch 10/20  Iteration 1652/3560 Training loss: 1.9685 0.7715 sec/batch
Epoch 10/20  Iteration 1653/3560 Training loss: 1.9679 0.7817 sec/batch
Epoch 10/20  Iteration 1654/3560 Training loss: 1.9686 0.7705 sec/batch
Epoch 10/20  Iteration 1655/3560 Training loss: 1.9682 0.7649 sec/batch
Epoch 10/20  Iteration 1656/3560 Training loss: 1.9680 0.7621 sec/batch
Epoch 10/20  Iteration 1657/3560 Training loss: 1.9673 0.7542 sec/batch
Epoch 10/20  Iteration 1658/3560 Training loss: 1.9674 0.7626 sec/batch
Epoch 10/20  Iteration 1659/3560 Training loss: 1.9676 0.7605 sec/batch
Epoch 10/20  Iteration 1660/3560 Training loss: 1.9672 0.7718 sec/batch
Epoch 10/20  Iteration 1661/3560 Training loss: 1.9667 0.7750 sec/batch
Epoch 10/20  Iteration 1662/3560 Training loss: 1.9669 0.7953 sec/batch
Epoch 10/20  Iteration 1663/3560 Training loss: 1.9667 0.7771 sec/batch
Epoch 10/20  Iteration 1664/3560 Training loss: 1.9673 0.7770 sec/batch
Epoch 10/20  Iteration 1665/3560 Training loss: 1.9676 0.7741 sec/batch
Epoch 10/20  Iteration 1666/3560 Training loss: 1.9677 0.7553 sec/batch
Epoch 10/20  Iteration 1667/3560 Training loss: 1.9676 0.7517 sec/batch
Epoch 10/20  Iteration 1668/3560 Training loss: 1.9680 0.7511 sec/batch
Epoch 10/20  Iteration 1669/3560 Training loss: 1.9680 0.7606 sec/batch
Epoch 10/20  Iteration 1670/3560 Training loss: 1.9675 0.7550 sec/batch
Epoch 10/20  Iteration 1671/3560 Training loss: 1.9671 0.7608 sec/batch
Epoch 10/20  Iteration 1672/3560 Training loss: 1.9670 0.7715 sec/batch
Epoch 10/20  Iteration 1673/3560 Training loss: 1.9674 0.7564 sec/batch
Epoch 10/20  Iteration 1674/3560 Training loss: 1.9675 0.7569 sec/batch
Epoch 10/20  Iteration 1675/3560 Training loss: 1.9677 0.7632 sec/batch
Epoch 10/20  Iteration 1676/3560 Training loss: 1.9674 0.7658 sec/batch
Epoch 10/20  Iteration 1677/3560 Training loss: 1.9672 0.7614 sec/batch
Epoch 10/20  Iteration 1678/3560 Training loss: 1.9675 0.7607 sec/batch
Epoch 10/20  Iteration 1679/3560 Training loss: 1.9674 0.7587 sec/batch
Epoch 10/20  Iteration 1680/3560 Training loss: 1.9675 0.7610 sec/batch
Epoch 10/20  Iteration 1681/3560 Training loss: 1.9669 0.7616 sec/batch
Epoch 10/20  Iteration 1682/3560 Training loss: 1.9668 0.7538 sec/batch
Epoch 10/20  Iteration 1683/3560 Training loss: 1.9664 0.7697 sec/batch
Epoch 10/20  Iteration 1684/3560 Training loss: 1.9665 0.7591 sec/batch
Epoch 10/20  Iteration 1685/3560 Training loss: 1.9659 0.7632 sec/batch
Epoch 10/20  Iteration 1686/3560 Training loss: 1.9657 0.7739 sec/batch
Epoch 10/20  Iteration 1687/3560 Training loss: 1.9652 0.7669 sec/batch
Epoch 10/20  Iteration 1688/3560 Training loss: 1.9648 0.7601 sec/batch
Epoch 10/20  Iteration 1689/3560 Training loss: 1.9646 0.7662 sec/batch
Epoch 10/20  Iteration 1690/3560 Training loss: 1.9643 0.7547 sec/batch
Epoch 10/20  Iteration 1691/3560 Training loss: 1.9638 0.7637 sec/batch
Epoch 10/20  Iteration 1692/3560 Training loss: 1.9638 0.7905 sec/batch
Epoch 10/20  Iteration 1693/3560 Training loss: 1.9635 0.7635 sec/batch
Epoch 10/20  Iteration 1694/3560 Training loss: 1.9633 0.7496 sec/batch
Epoch 10/20  Iteration 1695/3560 Training loss: 1.9629 0.7528 sec/batch
Epoch 10/20  Iteration 1696/3560 Training loss: 1.9624 0.7597 sec/batch
Epoch 10/20  Iteration 1697/3560 Training loss: 1.9621 0.7617 sec/batch
Epoch 10/20  Iteration 1698/3560 Training loss: 1.9620 0.7632 sec/batch
Epoch 10/20  Iteration 1699/3560 Training loss: 1.9619 0.7669 sec/batch
Epoch 10/20  Iteration 1700/3560 Training loss: 1.9616 0.7681 sec/batch
Epoch 10/20  Iteration 1701/3560 Training loss: 1.9612 0.7494 sec/batch
Epoch 10/20  Iteration 1702/3560 Training loss: 1.9608 0.7630 sec/batch
Epoch 10/20  Iteration 1703/3560 Training loss: 1.9608 0.7591 sec/batch
Epoch 10/20  Iteration 1704/3560 Training loss: 1.9607 0.7522 sec/batch
Epoch 10/20  Iteration 1705/3560 Training loss: 1.9604 0.7611 sec/batch
Epoch 10/20  Iteration 1706/3560 Training loss: 1.9603 0.7618 sec/batch
Epoch 10/20  Iteration 1707/3560 Training loss: 1.9601 0.7512 sec/batch
Epoch 10/20  Iteration 1708/3560 Training loss: 1.9600 0.7522 sec/batch
Epoch 10/20  Iteration 1709/3560 Training loss: 1.9599 0.7698 sec/batch
Epoch 10/20  Iteration 1710/3560 Training loss: 1.9598 0.8841 sec/batch
Epoch 10/20  Iteration 1711/3560 Training loss: 1.9598 0.7829 sec/batch
Epoch 10/20  Iteration 1712/3560 Training loss: 1.9597 0.8329 sec/batch
Epoch 10/20  Iteration 1713/3560 Training loss: 1.9596 0.7972 sec/batch
Epoch 10/20  Iteration 1714/3560 Training loss: 1.9595 0.7622 sec/batch
Epoch 10/20  Iteration 1715/3560 Training loss: 1.9593 0.7573 sec/batch
Epoch 10/20  Iteration 1716/3560 Training loss: 1.9593 0.7586 sec/batch
Epoch 10/20  Iteration 1717/3560 Training loss: 1.9591 0.7565 sec/batch
Epoch 10/20  Iteration 1718/3560 Training loss: 1.9587 0.7652 sec/batch
Epoch 10/20  Iteration 1719/3560 Training loss: 1.9587 0.7620 sec/batch
Epoch 10/20  Iteration 1720/3560 Training loss: 1.9586 0.7724 sec/batch
Epoch 10/20  Iteration 1721/3560 Training loss: 1.9586 0.7697 sec/batch
Epoch 10/20  Iteration 1722/3560 Training loss: 1.9584 0.7614 sec/batch
Epoch 10/20  Iteration 1723/3560 Training loss: 1.9583 0.7603 sec/batch
Epoch 10/20  Iteration 1724/3560 Training loss: 1.9581 0.7599 sec/batch
Epoch 10/20  Iteration 1725/3560 Training loss: 1.9579 0.7613 sec/batch
Epoch 10/20  Iteration 1726/3560 Training loss: 1.9581 0.7491 sec/batch
Epoch 10/20  Iteration 1727/3560 Training loss: 1.9580 0.7581 sec/batch
Epoch 10/20  Iteration 1728/3560 Training loss: 1.9576 0.7532 sec/batch
Epoch 10/20  Iteration 1729/3560 Training loss: 1.9577 0.7688 sec/batch
Epoch 10/20  Iteration 1730/3560 Training loss: 1.9577 0.7661 sec/batch
Epoch 10/20  Iteration 1731/3560 Training loss: 1.9576 0.7766 sec/batch
Epoch 10/20  Iteration 1732/3560 Training loss: 1.9575 0.7628 sec/batch
Epoch 10/20  Iteration 1733/3560 Training loss: 1.9572 0.7595 sec/batch
Epoch 10/20  Iteration 1734/3560 Training loss: 1.9570 0.7572 sec/batch
Epoch 10/20  Iteration 1735/3560 Training loss: 1.9570 0.7621 sec/batch
Epoch 10/20  Iteration 1736/3560 Training loss: 1.9570 0.7600 sec/batch
Epoch 10/20  Iteration 1737/3560 Training loss: 1.9569 0.7658 sec/batch
Epoch 10/20  Iteration 1738/3560 Training loss: 1.9569 0.7627 sec/batch
Epoch 10/20  Iteration 1739/3560 Training loss: 1.9569 0.7661 sec/batch
Epoch 10/20  Iteration 1740/3560 Training loss: 1.9569 0.7654 sec/batch
Epoch 10/20  Iteration 1741/3560 Training loss: 1.9570 0.7610 sec/batch
Epoch 10/20  Iteration 1742/3560 Training loss: 1.9568 0.7732 sec/batch
Epoch 10/20  Iteration 1743/3560 Training loss: 1.9569 0.7668 sec/batch
Epoch 10/20  Iteration 1744/3560 Training loss: 1.9567 0.7648 sec/batch
Epoch 10/20  Iteration 1745/3560 Training loss: 1.9567 0.7670 sec/batch
Epoch 10/20  Iteration 1746/3560 Training loss: 1.9566 0.7520 sec/batch
Epoch 10/20  Iteration 1747/3560 Training loss: 1.9565 0.7564 sec/batch
Epoch 10/20  Iteration 1748/3560 Training loss: 1.9566 0.7440 sec/batch
Epoch 10/20  Iteration 1749/3560 Training loss: 1.9566 0.7628 sec/batch
Epoch 10/20  Iteration 1750/3560 Training loss: 1.9567 0.7621 sec/batch
Epoch 10/20  Iteration 1751/3560 Training loss: 1.9567 0.7628 sec/batch
Epoch 10/20  Iteration 1752/3560 Training loss: 1.9565 0.7516 sec/batch
Epoch 10/20  Iteration 1753/3560 Training loss: 1.9563 0.7544 sec/batch
Epoch 10/20  Iteration 1754/3560 Training loss: 1.9565 0.7675 sec/batch
Epoch 10/20  Iteration 1755/3560 Training loss: 1.9564 0.7592 sec/batch
Epoch 10/20  Iteration 1756/3560 Training loss: 1.9565 0.7488 sec/batch
Epoch 10/20  Iteration 1757/3560 Training loss: 1.9564 0.7563 sec/batch
Epoch 10/20  Iteration 1758/3560 Training loss: 1.9563 0.7608 sec/batch
Epoch 10/20  Iteration 1759/3560 Training loss: 1.9563 0.7563 sec/batch
Epoch 10/20  Iteration 1760/3560 Training loss: 1.9562 0.7530 sec/batch
Epoch 10/20  Iteration 1761/3560 Training loss: 1.9560 0.7514 sec/batch
Epoch 10/20  Iteration 1762/3560 Training loss: 1.9561 0.7550 sec/batch
Epoch 10/20  Iteration 1763/3560 Training loss: 1.9562 0.7632 sec/batch
Epoch 10/20  Iteration 1764/3560 Training loss: 1.9562 0.7528 sec/batch
Epoch 10/20  Iteration 1765/3560 Training loss: 1.9562 0.7575 sec/batch
Epoch 10/20  Iteration 1766/3560 Training loss: 1.9562 0.7501 sec/batch
Epoch 10/20  Iteration 1767/3560 Training loss: 1.9562 0.7638 sec/batch
Epoch 10/20  Iteration 1768/3560 Training loss: 1.9561 0.7574 sec/batch
Epoch 10/20  Iteration 1769/3560 Training loss: 1.9560 0.7587 sec/batch
Epoch 10/20  Iteration 1770/3560 Training loss: 1.9562 0.7767 sec/batch
Epoch 10/20  Iteration 1771/3560 Training loss: 1.9561 0.7644 sec/batch
Epoch 10/20  Iteration 1772/3560 Training loss: 1.9560 0.7597 sec/batch
Epoch 10/20  Iteration 1773/3560 Training loss: 1.9559 0.7606 sec/batch
Epoch 10/20  Iteration 1774/3560 Training loss: 1.9557 0.7527 sec/batch
Epoch 10/20  Iteration 1775/3560 Training loss: 1.9558 0.7526 sec/batch
Epoch 10/20  Iteration 1776/3560 Training loss: 1.9558 0.7584 sec/batch
Epoch 10/20  Iteration 1777/3560 Training loss: 1.9559 0.7573 sec/batch
Epoch 10/20  Iteration 1778/3560 Training loss: 1.9558 0.7544 sec/batch
Epoch 10/20  Iteration 1779/3560 Training loss: 1.9556 0.7689 sec/batch
Epoch 10/20  Iteration 1780/3560 Training loss: 1.9555 0.7581 sec/batch
Epoch 11/20  Iteration 1781/3560 Training loss: 2.0141 0.7869 sec/batch
Epoch 11/20  Iteration 1782/3560 Training loss: 1.9749 0.7517 sec/batch
Epoch 11/20  Iteration 1783/3560 Training loss: 1.9621 0.7897 sec/batch
Epoch 11/20  Iteration 1784/3560 Training loss: 1.9559 0.7556 sec/batch
Epoch 11/20  Iteration 1785/3560 Training loss: 1.9553 0.7526 sec/batch
Epoch 11/20  Iteration 1786/3560 Training loss: 1.9456 0.7550 sec/batch
Epoch 11/20  Iteration 1787/3560 Training loss: 1.9460 0.7529 sec/batch
Epoch 11/20  Iteration 1788/3560 Training loss: 1.9446 0.7535 sec/batch
Epoch 11/20  Iteration 1789/3560 Training loss: 1.9479 0.7640 sec/batch
Epoch 11/20  Iteration 1790/3560 Training loss: 1.9473 0.7513 sec/batch
Epoch 11/20  Iteration 1791/3560 Training loss: 1.9445 0.7522 sec/batch
Epoch 11/20  Iteration 1792/3560 Training loss: 1.9430 0.7531 sec/batch
Epoch 11/20  Iteration 1793/3560 Training loss: 1.9426 0.7625 sec/batch
Epoch 11/20  Iteration 1794/3560 Training loss: 1.9447 0.7620 sec/batch
Epoch 11/20  Iteration 1795/3560 Training loss: 1.9444 0.7590 sec/batch
Epoch 11/20  Iteration 1796/3560 Training loss: 1.9433 0.7533 sec/batch
Epoch 11/20  Iteration 1797/3560 Training loss: 1.9425 0.7631 sec/batch
Epoch 11/20  Iteration 1798/3560 Training loss: 1.9447 0.7575 sec/batch
Epoch 11/20  Iteration 1799/3560 Training loss: 1.9444 0.7626 sec/batch
Epoch 11/20  Iteration 1800/3560 Training loss: 1.9445 0.7514 sec/batch
Epoch 11/20  Iteration 1801/3560 Training loss: 1.9432 0.7655 sec/batch
Epoch 11/20  Iteration 1802/3560 Training loss: 1.9442 0.7705 sec/batch
Epoch 11/20  Iteration 1803/3560 Training loss: 1.9435 0.7700 sec/batch
Epoch 11/20  Iteration 1804/3560 Training loss: 1.9431 0.7566 sec/batch
Epoch 11/20  Iteration 1805/3560 Training loss: 1.9425 0.7644 sec/batch
Epoch 11/20  Iteration 1806/3560 Training loss: 1.9415 0.7577 sec/batch
Epoch 11/20  Iteration 1807/3560 Training loss: 1.9407 0.7485 sec/batch
Epoch 11/20  Iteration 1808/3560 Training loss: 1.9408 0.7526 sec/batch
Epoch 11/20  Iteration 1809/3560 Training loss: 1.9421 0.7873 sec/batch
Epoch 11/20  Iteration 1810/3560 Training loss: 1.9423 0.7627 sec/batch
Epoch 11/20  Iteration 1811/3560 Training loss: 1.9421 0.7556 sec/batch
Epoch 11/20  Iteration 1812/3560 Training loss: 1.9415 0.7443 sec/batch
Epoch 11/20  Iteration 1813/3560 Training loss: 1.9416 0.7507 sec/batch
Epoch 11/20  Iteration 1814/3560 Training loss: 1.9425 0.7591 sec/batch
Epoch 11/20  Iteration 1815/3560 Training loss: 1.9423 0.8037 sec/batch
Epoch 11/20  Iteration 1816/3560 Training loss: 1.9419 0.9082 sec/batch
Epoch 11/20  Iteration 1817/3560 Training loss: 1.9417 0.7993 sec/batch
Epoch 11/20  Iteration 1818/3560 Training loss: 1.9404 0.7575 sec/batch
Epoch 11/20  Iteration 1819/3560 Training loss: 1.9395 0.7628 sec/batch
Epoch 11/20  Iteration 1820/3560 Training loss: 1.9389 0.8220 sec/batch
Epoch 11/20  Iteration 1821/3560 Training loss: 1.9385 0.8517 sec/batch
Epoch 11/20  Iteration 1822/3560 Training loss: 1.9386 0.7682 sec/batch
Epoch 11/20  Iteration 1823/3560 Training loss: 1.9382 0.7633 sec/batch
Epoch 11/20  Iteration 1824/3560 Training loss: 1.9374 0.7771 sec/batch
Epoch 11/20  Iteration 1825/3560 Training loss: 1.9376 0.7723 sec/batch
Epoch 11/20  Iteration 1826/3560 Training loss: 1.9364 0.7594 sec/batch
Epoch 11/20  Iteration 1827/3560 Training loss: 1.9364 0.7733 sec/batch
Epoch 11/20  Iteration 1828/3560 Training loss: 1.9361 0.7638 sec/batch
Epoch 11/20  Iteration 1829/3560 Training loss: 1.9359 0.7842 sec/batch
Epoch 11/20  Iteration 1830/3560 Training loss: 1.9365 0.7710 sec/batch
Epoch 11/20  Iteration 1831/3560 Training loss: 1.9360 0.7766 sec/batch
Epoch 11/20  Iteration 1832/3560 Training loss: 1.9368 0.7705 sec/batch
Epoch 11/20  Iteration 1833/3560 Training loss: 1.9365 0.7600 sec/batch
Epoch 11/20  Iteration 1834/3560 Training loss: 1.9363 0.7770 sec/batch
Epoch 11/20  Iteration 1835/3560 Training loss: 1.9360 0.7722 sec/batch
Epoch 11/20  Iteration 1836/3560 Training loss: 1.9360 0.7687 sec/batch
Epoch 11/20  Iteration 1837/3560 Training loss: 1.9360 0.7710 sec/batch
Epoch 11/20  Iteration 1838/3560 Training loss: 1.9358 0.7728 sec/batch
Epoch 11/20  Iteration 1839/3560 Training loss: 1.9352 0.7711 sec/batch
Epoch 11/20  Iteration 1840/3560 Training loss: 1.9355 0.7727 sec/batch
Epoch 11/20  Iteration 1841/3560 Training loss: 1.9354 0.7649 sec/batch
Epoch 11/20  Iteration 1842/3560 Training loss: 1.9359 0.7634 sec/batch
Epoch 11/20  Iteration 1843/3560 Training loss: 1.9363 0.7652 sec/batch
Epoch 11/20  Iteration 1844/3560 Training loss: 1.9364 0.8440 sec/batch
Epoch 11/20  Iteration 1845/3560 Training loss: 1.9363 0.8239 sec/batch
Epoch 11/20  Iteration 1846/3560 Training loss: 1.9365 0.7764 sec/batch
Epoch 11/20  Iteration 1847/3560 Training loss: 1.9366 0.7989 sec/batch
Epoch 11/20  Iteration 1848/3560 Training loss: 1.9360 0.7629 sec/batch
Epoch 11/20  Iteration 1849/3560 Training loss: 1.9357 0.7634 sec/batch
Epoch 11/20  Iteration 1850/3560 Training loss: 1.9357 0.7640 sec/batch
Epoch 11/20  Iteration 1851/3560 Training loss: 1.9359 0.7673 sec/batch
Epoch 11/20  Iteration 1852/3560 Training loss: 1.9359 0.7738 sec/batch
Epoch 11/20  Iteration 1853/3560 Training loss: 1.9363 0.7813 sec/batch
Epoch 11/20  Iteration 1854/3560 Training loss: 1.9360 0.7887 sec/batch
Epoch 11/20  Iteration 1855/3560 Training loss: 1.9360 0.7733 sec/batch
Epoch 11/20  Iteration 1856/3560 Training loss: 1.9363 0.7754 sec/batch
Epoch 11/20  Iteration 1857/3560 Training loss: 1.9363 0.7595 sec/batch
Epoch 11/20  Iteration 1858/3560 Training loss: 1.9362 0.7978 sec/batch
Epoch 11/20  Iteration 1859/3560 Training loss: 1.9357 0.7701 sec/batch
Epoch 11/20  Iteration 1860/3560 Training loss: 1.9355 0.8042 sec/batch
Epoch 11/20  Iteration 1861/3560 Training loss: 1.9350 0.7685 sec/batch
Epoch 11/20  Iteration 1862/3560 Training loss: 1.9350 0.8493 sec/batch
Epoch 11/20  Iteration 1863/3560 Training loss: 1.9345 0.7776 sec/batch
Epoch 11/20  Iteration 1864/3560 Training loss: 1.9343 0.6408 sec/batch
Epoch 11/20  Iteration 1865/3560 Training loss: 1.9338 0.7287 sec/batch
Epoch 11/20  Iteration 1866/3560 Training loss: 1.9335 0.6527 sec/batch
Epoch 11/20  Iteration 1867/3560 Training loss: 1.9335 0.6658 sec/batch
Epoch 11/20  Iteration 1868/3560 Training loss: 1.9331 0.6547 sec/batch
Epoch 11/20  Iteration 1869/3560 Training loss: 1.9326 0.6426 sec/batch
Epoch 11/20  Iteration 1870/3560 Training loss: 1.9326 0.6254 sec/batch
Epoch 11/20  Iteration 1871/3560 Training loss: 1.9323 0.6332 sec/batch
Epoch 11/20  Iteration 1872/3560 Training loss: 1.9321 0.6356 sec/batch
Epoch 11/20  Iteration 1873/3560 Training loss: 1.9317 0.6413 sec/batch
Epoch 11/20  Iteration 1874/3560 Training loss: 1.9314 0.6329 sec/batch
Epoch 11/20  Iteration 1875/3560 Training loss: 1.9310 0.6512 sec/batch
Epoch 11/20  Iteration 1876/3560 Training loss: 1.9310 0.6474 sec/batch
Epoch 11/20  Iteration 1877/3560 Training loss: 1.9309 0.6328 sec/batch
Epoch 11/20  Iteration 1878/3560 Training loss: 1.9306 0.6402 sec/batch
Epoch 11/20  Iteration 1879/3560 Training loss: 1.9301 0.6352 sec/batch
Epoch 11/20  Iteration 1880/3560 Training loss: 1.9297 0.6399 sec/batch
Epoch 11/20  Iteration 1881/3560 Training loss: 1.9296 0.6322 sec/batch
Epoch 11/20  Iteration 1882/3560 Training loss: 1.9295 0.6321 sec/batch
Epoch 11/20  Iteration 1883/3560 Training loss: 1.9292 0.6292 sec/batch
Epoch 11/20  Iteration 1884/3560 Training loss: 1.9290 0.6352 sec/batch
Epoch 11/20  Iteration 1885/3560 Training loss: 1.9287 0.6387 sec/batch
Epoch 11/20  Iteration 1886/3560 Training loss: 1.9286 0.6352 sec/batch
Epoch 11/20  Iteration 1887/3560 Training loss: 1.9285 0.6364 sec/batch
Epoch 11/20  Iteration 1888/3560 Training loss: 1.9285 0.6363 sec/batch
Epoch 11/20  Iteration 1889/3560 Training loss: 1.9285 0.6327 sec/batch
Epoch 11/20  Iteration 1890/3560 Training loss: 1.9284 0.6533 sec/batch
Epoch 11/20  Iteration 1891/3560 Training loss: 1.9284 0.6366 sec/batch
Epoch 11/20  Iteration 1892/3560 Training loss: 1.9282 0.6264 sec/batch
Epoch 11/20  Iteration 1893/3560 Training loss: 1.9281 0.6387 sec/batch
Epoch 11/20  Iteration 1894/3560 Training loss: 1.9280 0.6456 sec/batch
Epoch 11/20  Iteration 1895/3560 Training loss: 1.9278 0.6321 sec/batch
Epoch 11/20  Iteration 1896/3560 Training loss: 1.9274 0.6282 sec/batch
Epoch 11/20  Iteration 1897/3560 Training loss: 1.9274 0.6503 sec/batch
Epoch 11/20  Iteration 1898/3560 Training loss: 1.9274 0.8331 sec/batch
Epoch 11/20  Iteration 1899/3560 Training loss: 1.9275 0.7808 sec/batch
Epoch 11/20  Iteration 1900/3560 Training loss: 1.9274 0.6410 sec/batch
Epoch 11/20  Iteration 1901/3560 Training loss: 1.9274 0.6342 sec/batch
Epoch 11/20  Iteration 1902/3560 Training loss: 1.9271 0.6339 sec/batch
Epoch 11/20  Iteration 1903/3560 Training loss: 1.9269 0.6383 sec/batch
Epoch 11/20  Iteration 1904/3560 Training loss: 1.9269 0.6329 sec/batch
Epoch 11/20  Iteration 1905/3560 Training loss: 1.9268 0.6390 sec/batch
Epoch 11/20  Iteration 1906/3560 Training loss: 1.9264 0.6345 sec/batch
Epoch 11/20  Iteration 1907/3560 Training loss: 1.9266 0.6304 sec/batch
Epoch 11/20  Iteration 1908/3560 Training loss: 1.9266 0.6361 sec/batch
Epoch 11/20  Iteration 1909/3560 Training loss: 1.9265 0.6316 sec/batch
Epoch 11/20  Iteration 1910/3560 Training loss: 1.9264 0.6345 sec/batch
Epoch 11/20  Iteration 1911/3560 Training loss: 1.9261 0.6280 sec/batch
Epoch 11/20  Iteration 1912/3560 Training loss: 1.9258 0.6330 sec/batch
Epoch 11/20  Iteration 1913/3560 Training loss: 1.9258 0.6319 sec/batch
Epoch 11/20  Iteration 1914/3560 Training loss: 1.9258 0.6237 sec/batch
Epoch 11/20  Iteration 1915/3560 Training loss: 1.9257 0.6387 sec/batch
Epoch 11/20  Iteration 1916/3560 Training loss: 1.9257 0.6340 sec/batch
Epoch 11/20  Iteration 1917/3560 Training loss: 1.9257 0.6265 sec/batch
Epoch 11/20  Iteration 1918/3560 Training loss: 1.9257 0.6361 sec/batch
Epoch 11/20  Iteration 1919/3560 Training loss: 1.9258 0.6354 sec/batch
Epoch 11/20  Iteration 1920/3560 Training loss: 1.9257 0.6329 sec/batch
Epoch 11/20  Iteration 1921/3560 Training loss: 1.9259 0.6321 sec/batch
Epoch 11/20  Iteration 1922/3560 Training loss: 1.9258 0.6328 sec/batch
Epoch 11/20  Iteration 1923/3560 Training loss: 1.9257 0.6440 sec/batch
Epoch 11/20  Iteration 1924/3560 Training loss: 1.9257 0.6456 sec/batch
Epoch 11/20  Iteration 1925/3560 Training loss: 1.9256 0.7731 sec/batch
Epoch 11/20  Iteration 1926/3560 Training loss: 1.9256 0.6848 sec/batch
Epoch 11/20  Iteration 1927/3560 Training loss: 1.9255 0.7077 sec/batch
Epoch 11/20  Iteration 1928/3560 Training loss: 1.9257 0.6253 sec/batch
Epoch 11/20  Iteration 1929/3560 Training loss: 1.9256 0.6357 sec/batch
Epoch 11/20  Iteration 1930/3560 Training loss: 1.9255 0.6339 sec/batch
Epoch 11/20  Iteration 1931/3560 Training loss: 1.9253 0.6356 sec/batch
Epoch 11/20  Iteration 1932/3560 Training loss: 1.9255 0.6418 sec/batch
Epoch 11/20  Iteration 1933/3560 Training loss: 1.9254 0.6336 sec/batch
Epoch 11/20  Iteration 1934/3560 Training loss: 1.9255 0.6350 sec/batch
Epoch 11/20  Iteration 1935/3560 Training loss: 1.9254 0.6332 sec/batch
Epoch 11/20  Iteration 1936/3560 Training loss: 1.9253 0.6500 sec/batch
Epoch 11/20  Iteration 1937/3560 Training loss: 1.9253 0.6328 sec/batch
Epoch 11/20  Iteration 1938/3560 Training loss: 1.9252 0.6336 sec/batch
Epoch 11/20  Iteration 1939/3560 Training loss: 1.9250 0.6342 sec/batch
Epoch 11/20  Iteration 1940/3560 Training loss: 1.9252 0.6345 sec/batch
Epoch 11/20  Iteration 1941/3560 Training loss: 1.9253 0.6316 sec/batch
Epoch 11/20  Iteration 1942/3560 Training loss: 1.9252 0.6324 sec/batch
Epoch 11/20  Iteration 1943/3560 Training loss: 1.9252 0.6268 sec/batch
Epoch 11/20  Iteration 1944/3560 Training loss: 1.9251 0.6374 sec/batch
Epoch 11/20  Iteration 1945/3560 Training loss: 1.9251 0.6172 sec/batch
Epoch 11/20  Iteration 1946/3560 Training loss: 1.9251 0.6387 sec/batch
Epoch 11/20  Iteration 1947/3560 Training loss: 1.9250 0.6365 sec/batch
Epoch 11/20  Iteration 1948/3560 Training loss: 1.9253 0.6286 sec/batch
Epoch 11/20  Iteration 1949/3560 Training loss: 1.9253 0.6471 sec/batch
Epoch 11/20  Iteration 1950/3560 Training loss: 1.9252 0.6377 sec/batch
Epoch 11/20  Iteration 1951/3560 Training loss: 1.9250 0.6432 sec/batch
Epoch 11/20  Iteration 1952/3560 Training loss: 1.9248 0.6342 sec/batch
Epoch 11/20  Iteration 1953/3560 Training loss: 1.9249 0.6393 sec/batch
Epoch 11/20  Iteration 1954/3560 Training loss: 1.9249 0.6449 sec/batch
Epoch 11/20  Iteration 1955/3560 Training loss: 1.9250 0.6314 sec/batch
Epoch 11/20  Iteration 1956/3560 Training loss: 1.9249 0.6399 sec/batch
Epoch 11/20  Iteration 1957/3560 Training loss: 1.9247 0.6207 sec/batch
Epoch 11/20  Iteration 1958/3560 Training loss: 1.9247 0.6339 sec/batch
Epoch 12/20  Iteration 1959/3560 Training loss: 1.9979 0.6372 sec/batch
Epoch 12/20  Iteration 1960/3560 Training loss: 1.9471 0.6340 sec/batch
Epoch 12/20  Iteration 1961/3560 Training loss: 1.9319 0.6491 sec/batch
Epoch 12/20  Iteration 1962/3560 Training loss: 1.9263 0.6431 sec/batch
Epoch 12/20  Iteration 1963/3560 Training loss: 1.9250 0.6371 sec/batch
Epoch 12/20  Iteration 1964/3560 Training loss: 1.9165 0.6368 sec/batch
Epoch 12/20  Iteration 1965/3560 Training loss: 1.9160 0.6449 sec/batch
Epoch 12/20  Iteration 1966/3560 Training loss: 1.9153 0.6343 sec/batch
Epoch 12/20  Iteration 1967/3560 Training loss: 1.9180 0.6331 sec/batch
Epoch 12/20  Iteration 1968/3560 Training loss: 1.9173 0.6381 sec/batch
Epoch 12/20  Iteration 1969/3560 Training loss: 1.9144 0.6372 sec/batch
Epoch 12/20  Iteration 1970/3560 Training loss: 1.9130 0.6318 sec/batch
Epoch 12/20  Iteration 1971/3560 Training loss: 1.9126 0.6453 sec/batch
Epoch 12/20  Iteration 1972/3560 Training loss: 1.9155 0.6406 sec/batch
Epoch 12/20  Iteration 1973/3560 Training loss: 1.9148 0.6396 sec/batch
Epoch 12/20  Iteration 1974/3560 Training loss: 1.9131 0.6378 sec/batch
Epoch 12/20  Iteration 1975/3560 Training loss: 1.9128 0.6449 sec/batch
Epoch 12/20  Iteration 1976/3560 Training loss: 1.9144 0.6404 sec/batch
Epoch 12/20  Iteration 1977/3560 Training loss: 1.9142 0.6319 sec/batch
Epoch 12/20  Iteration 1978/3560 Training loss: 1.9141 0.6151 sec/batch
Epoch 12/20  Iteration 1979/3560 Training loss: 1.9133 0.6424 sec/batch
Epoch 12/20  Iteration 1980/3560 Training loss: 1.9144 0.6361 sec/batch
Epoch 12/20  Iteration 1981/3560 Training loss: 1.9137 0.6369 sec/batch
Epoch 12/20  Iteration 1982/3560 Training loss: 1.9130 0.6456 sec/batch
Epoch 12/20  Iteration 1983/3560 Training loss: 1.9132 0.6487 sec/batch
Epoch 12/20  Iteration 1984/3560 Training loss: 1.9125 0.6386 sec/batch
Epoch 12/20  Iteration 1985/3560 Training loss: 1.9119 0.6420 sec/batch
Epoch 12/20  Iteration 1986/3560 Training loss: 1.9120 0.6378 sec/batch
Epoch 12/20  Iteration 1987/3560 Training loss: 1.9133 0.6221 sec/batch
Epoch 12/20  Iteration 1988/3560 Training loss: 1.9136 0.6349 sec/batch
Epoch 12/20  Iteration 1989/3560 Training loss: 1.9135 0.6279 sec/batch
Epoch 12/20  Iteration 1990/3560 Training loss: 1.9127 0.6436 sec/batch
Epoch 12/20  Iteration 1991/3560 Training loss: 1.9128 0.6327 sec/batch
Epoch 12/20  Iteration 1992/3560 Training loss: 1.9137 0.6203 sec/batch
Epoch 12/20  Iteration 1993/3560 Training loss: 1.9133 0.6364 sec/batch
Epoch 12/20  Iteration 1994/3560 Training loss: 1.9130 0.6323 sec/batch
Epoch 12/20  Iteration 1995/3560 Training loss: 1.9127 0.6297 sec/batch
Epoch 12/20  Iteration 1996/3560 Training loss: 1.9114 0.6331 sec/batch
Epoch 12/20  Iteration 1997/3560 Training loss: 1.9103 0.6399 sec/batch
Epoch 12/20  Iteration 1998/3560 Training loss: 1.9095 0.6388 sec/batch
Epoch 12/20  Iteration 1999/3560 Training loss: 1.9087 0.6456 sec/batch
Epoch 12/20  Iteration 2000/3560 Training loss: 1.9088 0.6508 sec/batch
Epoch 12/20  Iteration 2001/3560 Training loss: 1.9083 0.6370 sec/batch
Epoch 12/20  Iteration 2002/3560 Training loss: 1.9077 0.6363 sec/batch
Epoch 12/20  Iteration 2003/3560 Training loss: 1.9080 0.6373 sec/batch
Epoch 12/20  Iteration 2004/3560 Training loss: 1.9069 0.6328 sec/batch
Epoch 12/20  Iteration 2005/3560 Training loss: 1.9070 0.6365 sec/batch
Epoch 12/20  Iteration 2006/3560 Training loss: 1.9065 0.6401 sec/batch
Epoch 12/20  Iteration 2007/3560 Training loss: 1.9062 0.6436 sec/batch
Epoch 12/20  Iteration 2008/3560 Training loss: 1.9069 0.6344 sec/batch
Epoch 12/20  Iteration 2009/3560 Training loss: 1.9064 0.6384 sec/batch
Epoch 12/20  Iteration 2010/3560 Training loss: 1.9075 0.6405 sec/batch
Epoch 12/20  Iteration 2011/3560 Training loss: 1.9072 0.6303 sec/batch
Epoch 12/20  Iteration 2012/3560 Training loss: 1.9070 0.6332 sec/batch
Epoch 12/20  Iteration 2013/3560 Training loss: 1.9066 0.6310 sec/batch
Epoch 12/20  Iteration 2014/3560 Training loss: 1.9068 0.6337 sec/batch
Epoch 12/20  Iteration 2015/3560 Training loss: 1.9070 0.6368 sec/batch
Epoch 12/20  Iteration 2016/3560 Training loss: 1.9068 0.6362 sec/batch
Epoch 12/20  Iteration 2017/3560 Training loss: 1.9063 0.6189 sec/batch
Epoch 12/20  Iteration 2018/3560 Training loss: 1.9067 0.6263 sec/batch
Epoch 12/20  Iteration 2019/3560 Training loss: 1.9064 0.6357 sec/batch
Epoch 12/20  Iteration 2020/3560 Training loss: 1.9071 0.6365 sec/batch
Epoch 12/20  Iteration 2021/3560 Training loss: 1.9075 0.6383 sec/batch
Epoch 12/20  Iteration 2022/3560 Training loss: 1.9076 0.6408 sec/batch
Epoch 12/20  Iteration 2023/3560 Training loss: 1.9074 0.6345 sec/batch
Epoch 12/20  Iteration 2024/3560 Training loss: 1.9076 0.6389 sec/batch
Epoch 12/20  Iteration 2025/3560 Training loss: 1.9078 0.6375 sec/batch
Epoch 12/20  Iteration 2026/3560 Training loss: 1.9074 0.6330 sec/batch
Epoch 12/20  Iteration 2027/3560 Training loss: 1.9072 0.6306 sec/batch
Epoch 12/20  Iteration 2028/3560 Training loss: 1.9072 0.6374 sec/batch
Epoch 12/20  Iteration 2029/3560 Training loss: 1.9076 0.6404 sec/batch
Epoch 12/20  Iteration 2030/3560 Training loss: 1.9077 0.6495 sec/batch
Epoch 12/20  Iteration 2031/3560 Training loss: 1.9080 0.6430 sec/batch
Epoch 12/20  Iteration 2032/3560 Training loss: 1.9078 0.6343 sec/batch
Epoch 12/20  Iteration 2033/3560 Training loss: 1.9076 0.6329 sec/batch
Epoch 12/20  Iteration 2034/3560 Training loss: 1.9078 0.6399 sec/batch
Epoch 12/20  Iteration 2035/3560 Training loss: 1.9077 0.6371 sec/batch
Epoch 12/20  Iteration 2036/3560 Training loss: 1.9079 0.6421 sec/batch
Epoch 12/20  Iteration 2037/3560 Training loss: 1.9074 0.6397 sec/batch
Epoch 12/20  Iteration 2038/3560 Training loss: 1.9073 0.6343 sec/batch
Epoch 12/20  Iteration 2039/3560 Training loss: 1.9069 0.6372 sec/batch
Epoch 12/20  Iteration 2040/3560 Training loss: 1.9069 0.6134 sec/batch
Epoch 12/20  Iteration 2041/3560 Training loss: 1.9063 0.6327 sec/batch
Epoch 12/20  Iteration 2042/3560 Training loss: 1.9062 0.6294 sec/batch
Epoch 12/20  Iteration 2043/3560 Training loss: 1.9057 0.7149 sec/batch
Epoch 12/20  Iteration 2044/3560 Training loss: 1.9053 0.7862 sec/batch
Epoch 12/20  Iteration 2045/3560 Training loss: 1.9052 0.7833 sec/batch
Epoch 12/20  Iteration 2046/3560 Training loss: 1.9048 0.7755 sec/batch
Epoch 12/20  Iteration 2047/3560 Training loss: 1.9043 0.7770 sec/batch
Epoch 12/20  Iteration 2048/3560 Training loss: 1.9044 0.8114 sec/batch
Epoch 12/20  Iteration 2049/3560 Training loss: 1.9040 0.9263 sec/batch
Epoch 12/20  Iteration 2050/3560 Training loss: 1.9038 0.8775 sec/batch
Epoch 12/20  Iteration 2051/3560 Training loss: 1.9034 0.7643 sec/batch
Epoch 12/20  Iteration 2052/3560 Training loss: 1.9031 0.8194 sec/batch
Epoch 12/20  Iteration 2053/3560 Training loss: 1.9028 0.7527 sec/batch
Epoch 12/20  Iteration 2054/3560 Training loss: 1.9028 0.7576 sec/batch
Epoch 12/20  Iteration 2055/3560 Training loss: 1.9026 0.7591 sec/batch
Epoch 12/20  Iteration 2056/3560 Training loss: 1.9023 0.6783 sec/batch
Epoch 12/20  Iteration 2057/3560 Training loss: 1.9020 0.6709 sec/batch
Epoch 12/20  Iteration 2058/3560 Training loss: 1.9017 0.6800 sec/batch
Epoch 12/20  Iteration 2059/3560 Training loss: 1.9017 0.6803 sec/batch
Epoch 12/20  Iteration 2060/3560 Training loss: 1.9016 0.6810 sec/batch
Epoch 12/20  Iteration 2061/3560 Training loss: 1.9013 0.6726 sec/batch
Epoch 12/20  Iteration 2062/3560 Training loss: 1.9010 0.6717 sec/batch
Epoch 12/20  Iteration 2063/3560 Training loss: 1.9008 0.6810 sec/batch
Epoch 12/20  Iteration 2064/3560 Training loss: 1.9007 0.6760 sec/batch
Epoch 12/20  Iteration 2065/3560 Training loss: 1.9006 0.6786 sec/batch
Epoch 12/20  Iteration 2066/3560 Training loss: 1.9006 0.6829 sec/batch
Epoch 12/20  Iteration 2067/3560 Training loss: 1.9006 0.6842 sec/batch
Epoch 12/20  Iteration 2068/3560 Training loss: 1.9006 0.6739 sec/batch
Epoch 12/20  Iteration 2069/3560 Training loss: 1.9005 0.6711 sec/batch
Epoch 12/20  Iteration 2070/3560 Training loss: 1.9003 0.6815 sec/batch
Epoch 12/20  Iteration 2071/3560 Training loss: 1.9001 0.6819 sec/batch
Epoch 12/20  Iteration 2072/3560 Training loss: 1.9001 0.6874 sec/batch
Epoch 12/20  Iteration 2073/3560 Training loss: 1.8998 0.6787 sec/batch
Epoch 12/20  Iteration 2074/3560 Training loss: 1.8994 0.6806 sec/batch
Epoch 12/20  Iteration 2075/3560 Training loss: 1.8994 0.6787 sec/batch
Epoch 12/20  Iteration 2076/3560 Training loss: 1.8994 0.6812 sec/batch
Epoch 12/20  Iteration 2077/3560 Training loss: 1.8993 0.6778 sec/batch
Epoch 12/20  Iteration 2078/3560 Training loss: 1.8992 0.6699 sec/batch
Epoch 12/20  Iteration 2079/3560 Training loss: 1.8993 0.6703 sec/batch
Epoch 12/20  Iteration 2080/3560 Training loss: 1.8990 0.6689 sec/batch
Epoch 12/20  Iteration 2081/3560 Training loss: 1.8988 0.6815 sec/batch
Epoch 12/20  Iteration 2082/3560 Training loss: 1.8989 0.6622 sec/batch
Epoch 12/20  Iteration 2083/3560 Training loss: 1.8988 0.6606 sec/batch
Epoch 12/20  Iteration 2084/3560 Training loss: 1.8984 0.6784 sec/batch
Epoch 12/20  Iteration 2085/3560 Training loss: 1.8984 0.6825 sec/batch
Epoch 12/20  Iteration 2086/3560 Training loss: 1.8985 0.6718 sec/batch
Epoch 12/20  Iteration 2087/3560 Training loss: 1.8985 0.6604 sec/batch
Epoch 12/20  Iteration 2088/3560 Training loss: 1.8984 0.6756 sec/batch
Epoch 12/20  Iteration 2089/3560 Training loss: 1.8981 0.6799 sec/batch
Epoch 12/20  Iteration 2090/3560 Training loss: 1.8978 0.6744 sec/batch
Epoch 12/20  Iteration 2091/3560 Training loss: 1.8979 0.6737 sec/batch
Epoch 12/20  Iteration 2092/3560 Training loss: 1.8979 0.6795 sec/batch
Epoch 12/20  Iteration 2093/3560 Training loss: 1.8978 0.6762 sec/batch
Epoch 12/20  Iteration 2094/3560 Training loss: 1.8978 0.6835 sec/batch
Epoch 12/20  Iteration 2095/3560 Training loss: 1.8978 0.6703 sec/batch
Epoch 12/20  Iteration 2096/3560 Training loss: 1.8978 0.6875 sec/batch
Epoch 12/20  Iteration 2097/3560 Training loss: 1.8979 0.6766 sec/batch
Epoch 12/20  Iteration 2098/3560 Training loss: 1.8978 0.6640 sec/batch
Epoch 12/20  Iteration 2099/3560 Training loss: 1.8980 0.6771 sec/batch
Epoch 12/20  Iteration 2100/3560 Training loss: 1.8978 0.6889 sec/batch
Epoch 12/20  Iteration 2101/3560 Training loss: 1.8978 0.6808 sec/batch
Epoch 12/20  Iteration 2102/3560 Training loss: 1.8978 0.6778 sec/batch
Epoch 12/20  Iteration 2103/3560 Training loss: 1.8976 0.6803 sec/batch
Epoch 12/20  Iteration 2104/3560 Training loss: 1.8977 0.6758 sec/batch
Epoch 12/20  Iteration 2105/3560 Training loss: 1.8976 0.6724 sec/batch
Epoch 12/20  Iteration 2106/3560 Training loss: 1.8978 0.6865 sec/batch
Epoch 12/20  Iteration 2107/3560 Training loss: 1.8978 0.6794 sec/batch
Epoch 12/20  Iteration 2108/3560 Training loss: 1.8976 0.6827 sec/batch
Epoch 12/20  Iteration 2109/3560 Training loss: 1.8973 0.6767 sec/batch
Epoch 12/20  Iteration 2110/3560 Training loss: 1.8974 0.6770 sec/batch
Epoch 12/20  Iteration 2111/3560 Training loss: 1.8974 0.6729 sec/batch
Epoch 12/20  Iteration 2112/3560 Training loss: 1.8974 0.6824 sec/batch
Epoch 12/20  Iteration 2113/3560 Training loss: 1.8973 0.6749 sec/batch
Epoch 12/20  Iteration 2114/3560 Training loss: 1.8972 0.6749 sec/batch
Epoch 12/20  Iteration 2115/3560 Training loss: 1.8973 0.6846 sec/batch
Epoch 12/20  Iteration 2116/3560 Training loss: 1.8972 0.6938 sec/batch
Epoch 12/20  Iteration 2117/3560 Training loss: 1.8970 0.6802 sec/batch
Epoch 12/20  Iteration 2118/3560 Training loss: 1.8971 0.6796 sec/batch
Epoch 12/20  Iteration 2119/3560 Training loss: 1.8972 0.6680 sec/batch
Epoch 12/20  Iteration 2120/3560 Training loss: 1.8971 0.6839 sec/batch
Epoch 12/20  Iteration 2121/3560 Training loss: 1.8972 0.6744 sec/batch
Epoch 12/20  Iteration 2122/3560 Training loss: 1.8973 0.6807 sec/batch
Epoch 12/20  Iteration 2123/3560 Training loss: 1.8973 0.6732 sec/batch
Epoch 12/20  Iteration 2124/3560 Training loss: 1.8972 0.6613 sec/batch
Epoch 12/20  Iteration 2125/3560 Training loss: 1.8972 0.6767 sec/batch
Epoch 12/20  Iteration 2126/3560 Training loss: 1.8975 0.6746 sec/batch
Epoch 12/20  Iteration 2127/3560 Training loss: 1.8974 0.6816 sec/batch
Epoch 12/20  Iteration 2128/3560 Training loss: 1.8973 0.6834 sec/batch
Epoch 12/20  Iteration 2129/3560 Training loss: 1.8971 0.6836 sec/batch
Epoch 12/20  Iteration 2130/3560 Training loss: 1.8970 0.6741 sec/batch
Epoch 12/20  Iteration 2131/3560 Training loss: 1.8970 0.6769 sec/batch
Epoch 12/20  Iteration 2132/3560 Training loss: 1.8970 0.6806 sec/batch
Epoch 12/20  Iteration 2133/3560 Training loss: 1.8971 0.6787 sec/batch
Epoch 12/20  Iteration 2134/3560 Training loss: 1.8970 0.6808 sec/batch
Epoch 12/20  Iteration 2135/3560 Training loss: 1.8968 0.6710 sec/batch
Epoch 12/20  Iteration 2136/3560 Training loss: 1.8969 0.6796 sec/batch
Epoch 13/20  Iteration 2137/3560 Training loss: 1.9575 0.6771 sec/batch
Epoch 13/20  Iteration 2138/3560 Training loss: 1.9188 0.6783 sec/batch
Epoch 13/20  Iteration 2139/3560 Training loss: 1.9029 0.6803 sec/batch
Epoch 13/20  Iteration 2140/3560 Training loss: 1.8991 0.6787 sec/batch
Epoch 13/20  Iteration 2141/3560 Training loss: 1.8978 0.6751 sec/batch
Epoch 13/20  Iteration 2142/3560 Training loss: 1.8873 0.6812 sec/batch
Epoch 13/20  Iteration 2143/3560 Training loss: 1.8903 0.6717 sec/batch
Epoch 13/20  Iteration 2144/3560 Training loss: 1.8892 0.6859 sec/batch
Epoch 13/20  Iteration 2145/3560 Training loss: 1.8914 0.6740 sec/batch
Epoch 13/20  Iteration 2146/3560 Training loss: 1.8900 0.6850 sec/batch
Epoch 13/20  Iteration 2147/3560 Training loss: 1.8878 0.6609 sec/batch
Epoch 13/20  Iteration 2148/3560 Training loss: 1.8861 0.6784 sec/batch
Epoch 13/20  Iteration 2149/3560 Training loss: 1.8864 0.6775 sec/batch
Epoch 13/20  Iteration 2150/3560 Training loss: 1.8891 0.6689 sec/batch
Epoch 13/20  Iteration 2151/3560 Training loss: 1.8888 0.6801 sec/batch
Epoch 13/20  Iteration 2152/3560 Training loss: 1.8871 0.6788 sec/batch
Epoch 13/20  Iteration 2153/3560 Training loss: 1.8868 0.6795 sec/batch
Epoch 13/20  Iteration 2154/3560 Training loss: 1.8889 0.6717 sec/batch
Epoch 13/20  Iteration 2155/3560 Training loss: 1.8888 0.6736 sec/batch
Epoch 13/20  Iteration 2156/3560 Training loss: 1.8887 0.6785 sec/batch
Epoch 13/20  Iteration 2157/3560 Training loss: 1.8876 0.6821 sec/batch
Epoch 13/20  Iteration 2158/3560 Training loss: 1.8889 0.6791 sec/batch
Epoch 13/20  Iteration 2159/3560 Training loss: 1.8880 0.6821 sec/batch
Epoch 13/20  Iteration 2160/3560 Training loss: 1.8877 0.6982 sec/batch
Epoch 13/20  Iteration 2161/3560 Training loss: 1.8873 0.6841 sec/batch
Epoch 13/20  Iteration 2162/3560 Training loss: 1.8862 0.6724 sec/batch
Epoch 13/20  Iteration 2163/3560 Training loss: 1.8858 0.6775 sec/batch
Epoch 13/20  Iteration 2164/3560 Training loss: 1.8863 0.6884 sec/batch
Epoch 13/20  Iteration 2165/3560 Training loss: 1.8875 0.6821 sec/batch
Epoch 13/20  Iteration 2166/3560 Training loss: 1.8875 0.6637 sec/batch
Epoch 13/20  Iteration 2167/3560 Training loss: 1.8876 0.6741 sec/batch
Epoch 13/20  Iteration 2168/3560 Training loss: 1.8869 0.6753 sec/batch
Epoch 13/20  Iteration 2169/3560 Training loss: 1.8868 0.6786 sec/batch
Epoch 13/20  Iteration 2170/3560 Training loss: 1.8875 0.6717 sec/batch
Epoch 13/20  Iteration 2171/3560 Training loss: 1.8874 0.6856 sec/batch
Epoch 13/20  Iteration 2172/3560 Training loss: 1.8873 0.6670 sec/batch
Epoch 13/20  Iteration 2173/3560 Training loss: 1.8869 0.6736 sec/batch
Epoch 13/20  Iteration 2174/3560 Training loss: 1.8859 0.6914 sec/batch
Epoch 13/20  Iteration 2175/3560 Training loss: 1.8846 0.6700 sec/batch
Epoch 13/20  Iteration 2176/3560 Training loss: 1.8839 0.6719 sec/batch
Epoch 13/20  Iteration 2177/3560 Training loss: 1.8832 0.6707 sec/batch
Epoch 13/20  Iteration 2178/3560 Training loss: 1.8834 0.6749 sec/batch
Epoch 13/20  Iteration 2179/3560 Training loss: 1.8833 0.6782 sec/batch
Epoch 13/20  Iteration 2180/3560 Training loss: 1.8822 0.6838 sec/batch
Epoch 13/20  Iteration 2181/3560 Training loss: 1.8824 0.6888 sec/batch
Epoch 13/20  Iteration 2182/3560 Training loss: 1.8813 0.6755 sec/batch
Epoch 13/20  Iteration 2183/3560 Training loss: 1.8809 0.6828 sec/batch
Epoch 13/20  Iteration 2184/3560 Training loss: 1.8805 0.6792 sec/batch
Epoch 13/20  Iteration 2185/3560 Training loss: 1.8805 0.6795 sec/batch
Epoch 13/20  Iteration 2186/3560 Training loss: 1.8811 0.6800 sec/batch
Epoch 13/20  Iteration 2187/3560 Training loss: 1.8805 0.6813 sec/batch
Epoch 13/20  Iteration 2188/3560 Training loss: 1.8812 0.6806 sec/batch
Epoch 13/20  Iteration 2189/3560 Training loss: 1.8809 0.6780 sec/batch
Epoch 13/20  Iteration 2190/3560 Training loss: 1.8809 0.6768 sec/batch
Epoch 13/20  Iteration 2191/3560 Training loss: 1.8804 0.6794 sec/batch
Epoch 13/20  Iteration 2192/3560 Training loss: 1.8804 0.6812 sec/batch
Epoch 13/20  Iteration 2193/3560 Training loss: 1.8807 0.6827 sec/batch
Epoch 13/20  Iteration 2194/3560 Training loss: 1.8803 0.6721 sec/batch
Epoch 13/20  Iteration 2195/3560 Training loss: 1.8798 0.6820 sec/batch
Epoch 13/20  Iteration 2196/3560 Training loss: 1.8803 0.6838 sec/batch
Epoch 13/20  Iteration 2197/3560 Training loss: 1.8801 0.6751 sec/batch
Epoch 13/20  Iteration 2198/3560 Training loss: 1.8809 0.6753 sec/batch
Epoch 13/20  Iteration 2199/3560 Training loss: 1.8814 0.6799 sec/batch
Epoch 13/20  Iteration 2200/3560 Training loss: 1.8815 0.6794 sec/batch
Epoch 13/20  Iteration 2201/3560 Training loss: 1.8815 0.6893 sec/batch
Epoch 13/20  Iteration 2202/3560 Training loss: 1.8818 0.6862 sec/batch
Epoch 13/20  Iteration 2203/3560 Training loss: 1.8818 0.6747 sec/batch
Epoch 13/20  Iteration 2204/3560 Training loss: 1.8813 0.7016 sec/batch
Epoch 13/20  Iteration 2205/3560 Training loss: 1.8811 0.6864 sec/batch
Epoch 13/20  Iteration 2206/3560 Training loss: 1.8810 0.6780 sec/batch
Epoch 13/20  Iteration 2207/3560 Training loss: 1.8813 0.6932 sec/batch
Epoch 13/20  Iteration 2208/3560 Training loss: 1.8815 0.6765 sec/batch
Epoch 13/20  Iteration 2209/3560 Training loss: 1.8819 0.6732 sec/batch
Epoch 13/20  Iteration 2210/3560 Training loss: 1.8817 0.6739 sec/batch
Epoch 13/20  Iteration 2211/3560 Training loss: 1.8818 0.6660 sec/batch
Epoch 13/20  Iteration 2212/3560 Training loss: 1.8820 0.6871 sec/batch
Epoch 13/20  Iteration 2213/3560 Training loss: 1.8820 0.6712 sec/batch
Epoch 13/20  Iteration 2214/3560 Training loss: 1.8821 0.6803 sec/batch
Epoch 13/20  Iteration 2215/3560 Training loss: 1.8816 0.6813 sec/batch
Epoch 13/20  Iteration 2216/3560 Training loss: 1.8816 0.6845 sec/batch
Epoch 13/20  Iteration 2217/3560 Training loss: 1.8812 0.7304 sec/batch
Epoch 13/20  Iteration 2218/3560 Training loss: 1.8813 0.8953 sec/batch
Epoch 13/20  Iteration 2219/3560 Training loss: 1.8807 0.8607 sec/batch
Epoch 13/20  Iteration 2220/3560 Training loss: 1.8806 0.8188 sec/batch
Epoch 13/20  Iteration 2221/3560 Training loss: 1.8802 0.7991 sec/batch
Epoch 13/20  Iteration 2222/3560 Training loss: 1.8798 0.7734 sec/batch
Epoch 13/20  Iteration 2223/3560 Training loss: 1.8798 0.8763 sec/batch
Epoch 13/20  Iteration 2224/3560 Training loss: 1.8795 0.8216 sec/batch
Epoch 13/20  Iteration 2225/3560 Training loss: 1.8790 0.7233 sec/batch
Epoch 13/20  Iteration 2226/3560 Training loss: 1.8791 0.7032 sec/batch
Epoch 13/20  Iteration 2227/3560 Training loss: 1.8788 0.6640 sec/batch
Epoch 13/20  Iteration 2228/3560 Training loss: 1.8787 0.6835 sec/batch
Epoch 13/20  Iteration 2229/3560 Training loss: 1.8783 0.6763 sec/batch
Epoch 13/20  Iteration 2230/3560 Training loss: 1.8780 0.6687 sec/batch
Epoch 13/20  Iteration 2231/3560 Training loss: 1.8776 0.6778 sec/batch
Epoch 13/20  Iteration 2232/3560 Training loss: 1.8777 0.6670 sec/batch
Epoch 13/20  Iteration 2233/3560 Training loss: 1.8775 0.6731 sec/batch
Epoch 13/20  Iteration 2234/3560 Training loss: 1.8773 0.6813 sec/batch
Epoch 13/20  Iteration 2235/3560 Training loss: 1.8769 0.6778 sec/batch
Epoch 13/20  Iteration 2236/3560 Training loss: 1.8765 0.6699 sec/batch
Epoch 13/20  Iteration 2237/3560 Training loss: 1.8765 0.6753 sec/batch
Epoch 13/20  Iteration 2238/3560 Training loss: 1.8764 0.6831 sec/batch
Epoch 13/20  Iteration 2239/3560 Training loss: 1.8761 0.6783 sec/batch
Epoch 13/20  Iteration 2240/3560 Training loss: 1.8759 0.6756 sec/batch
Epoch 13/20  Iteration 2241/3560 Training loss: 1.8757 0.6816 sec/batch
Epoch 13/20  Iteration 2242/3560 Training loss: 1.8756 0.6825 sec/batch
Epoch 13/20  Iteration 2243/3560 Training loss: 1.8755 0.6800 sec/batch
Epoch 13/20  Iteration 2244/3560 Training loss: 1.8754 0.6773 sec/batch
Epoch 13/20  Iteration 2245/3560 Training loss: 1.8755 0.6949 sec/batch
Epoch 13/20  Iteration 2246/3560 Training loss: 1.8754 0.6974 sec/batch
Epoch 13/20  Iteration 2247/3560 Training loss: 1.8753 0.6772 sec/batch
Epoch 13/20  Iteration 2248/3560 Training loss: 1.8753 0.6734 sec/batch
Epoch 13/20  Iteration 2249/3560 Training loss: 1.8751 0.6775 sec/batch
Epoch 13/20  Iteration 2250/3560 Training loss: 1.8751 0.6733 sec/batch
Epoch 13/20  Iteration 2251/3560 Training loss: 1.8748 0.6776 sec/batch
Epoch 13/20  Iteration 2252/3560 Training loss: 1.8744 0.6705 sec/batch
Epoch 13/20  Iteration 2253/3560 Training loss: 1.8744 0.6758 sec/batch
Epoch 13/20  Iteration 2254/3560 Training loss: 1.8743 0.6785 sec/batch
Epoch 13/20  Iteration 2255/3560 Training loss: 1.8743 0.6750 sec/batch
Epoch 13/20  Iteration 2256/3560 Training loss: 1.8744 0.6726 sec/batch
Epoch 13/20  Iteration 2257/3560 Training loss: 1.8744 0.6736 sec/batch
Epoch 13/20  Iteration 2258/3560 Training loss: 1.8741 0.6744 sec/batch
Epoch 13/20  Iteration 2259/3560 Training loss: 1.8740 0.6847 sec/batch
Epoch 13/20  Iteration 2260/3560 Training loss: 1.8741 0.6848 sec/batch
Epoch 13/20  Iteration 2261/3560 Training loss: 1.8740 0.6806 sec/batch
Epoch 13/20  Iteration 2262/3560 Training loss: 1.8737 0.6851 sec/batch
Epoch 13/20  Iteration 2263/3560 Training loss: 1.8738 0.6736 sec/batch
Epoch 13/20  Iteration 2264/3560 Training loss: 1.8739 0.6804 sec/batch
Epoch 13/20  Iteration 2265/3560 Training loss: 1.8738 0.6863 sec/batch
Epoch 13/20  Iteration 2266/3560 Training loss: 1.8738 0.6803 sec/batch
Epoch 13/20  Iteration 2267/3560 Training loss: 1.8736 0.6747 sec/batch
Epoch 13/20  Iteration 2268/3560 Training loss: 1.8734 0.6783 sec/batch
Epoch 13/20  Iteration 2269/3560 Training loss: 1.8734 0.6805 sec/batch
Epoch 13/20  Iteration 2270/3560 Training loss: 1.8734 0.6860 sec/batch
Epoch 13/20  Iteration 2271/3560 Training loss: 1.8733 0.6789 sec/batch
Epoch 13/20  Iteration 2272/3560 Training loss: 1.8733 0.6806 sec/batch
Epoch 13/20  Iteration 2273/3560 Training loss: 1.8733 0.6829 sec/batch
Epoch 13/20  Iteration 2274/3560 Training loss: 1.8733 0.6793 sec/batch
Epoch 13/20  Iteration 2275/3560 Training loss: 1.8734 0.6892 sec/batch
Epoch 13/20  Iteration 2276/3560 Training loss: 1.8733 0.6842 sec/batch
Epoch 13/20  Iteration 2277/3560 Training loss: 1.8735 0.6754 sec/batch
Epoch 13/20  Iteration 2278/3560 Training loss: 1.8734 0.6805 sec/batch
Epoch 13/20  Iteration 2279/3560 Training loss: 1.8734 0.6749 sec/batch
Epoch 13/20  Iteration 2280/3560 Training loss: 1.8734 0.6710 sec/batch
Epoch 13/20  Iteration 2281/3560 Training loss: 1.8733 0.6890 sec/batch
Epoch 13/20  Iteration 2282/3560 Training loss: 1.8734 0.6830 sec/batch
Epoch 13/20  Iteration 2283/3560 Training loss: 1.8734 0.6729 sec/batch
Epoch 13/20  Iteration 2284/3560 Training loss: 1.8736 0.6819 sec/batch
Epoch 13/20  Iteration 2285/3560 Training loss: 1.8736 0.6688 sec/batch
Epoch 13/20  Iteration 2286/3560 Training loss: 1.8734 0.6788 sec/batch
Epoch 13/20  Iteration 2287/3560 Training loss: 1.8732 0.6720 sec/batch
Epoch 13/20  Iteration 2288/3560 Training loss: 1.8734 0.6534 sec/batch
Epoch 13/20  Iteration 2289/3560 Training loss: 1.8733 0.6768 sec/batch
Epoch 13/20  Iteration 2290/3560 Training loss: 1.8733 0.6924 sec/batch
Epoch 13/20  Iteration 2291/3560 Training loss: 1.8733 0.6785 sec/batch
Epoch 13/20  Iteration 2292/3560 Training loss: 1.8733 0.6761 sec/batch
Epoch 13/20  Iteration 2293/3560 Training loss: 1.8732 0.6808 sec/batch
Epoch 13/20  Iteration 2294/3560 Training loss: 1.8732 0.6773 sec/batch
Epoch 13/20  Iteration 2295/3560 Training loss: 1.8730 0.6834 sec/batch
Epoch 13/20  Iteration 2296/3560 Training loss: 1.8731 0.6763 sec/batch
Epoch 13/20  Iteration 2297/3560 Training loss: 1.8732 0.6675 sec/batch
Epoch 13/20  Iteration 2298/3560 Training loss: 1.8732 0.6718 sec/batch
Epoch 13/20  Iteration 2299/3560 Training loss: 1.8733 0.6740 sec/batch
Epoch 13/20  Iteration 2300/3560 Training loss: 1.8732 0.6754 sec/batch
Epoch 13/20  Iteration 2301/3560 Training loss: 1.8732 0.6716 sec/batch
Epoch 13/20  Iteration 2302/3560 Training loss: 1.8732 0.6941 sec/batch
Epoch 13/20  Iteration 2303/3560 Training loss: 1.8732 0.6782 sec/batch
Epoch 13/20  Iteration 2304/3560 Training loss: 1.8734 0.6775 sec/batch
Epoch 13/20  Iteration 2305/3560 Training loss: 1.8733 0.6818 sec/batch
Epoch 13/20  Iteration 2306/3560 Training loss: 1.8732 0.6851 sec/batch
Epoch 13/20  Iteration 2307/3560 Training loss: 1.8731 0.6723 sec/batch
Epoch 13/20  Iteration 2308/3560 Training loss: 1.8730 0.6729 sec/batch
Epoch 13/20  Iteration 2309/3560 Training loss: 1.8731 0.6770 sec/batch
Epoch 13/20  Iteration 2310/3560 Training loss: 1.8731 0.6792 sec/batch
Epoch 13/20  Iteration 2311/3560 Training loss: 1.8732 0.6878 sec/batch
Epoch 13/20  Iteration 2312/3560 Training loss: 1.8731 0.6853 sec/batch
Epoch 13/20  Iteration 2313/3560 Training loss: 1.8729 0.6790 sec/batch
Epoch 13/20  Iteration 2314/3560 Training loss: 1.8729 0.6865 sec/batch
Epoch 14/20  Iteration 2315/3560 Training loss: 1.9334 0.6688 sec/batch
Epoch 14/20  Iteration 2316/3560 Training loss: 1.8937 0.6703 sec/batch
Epoch 14/20  Iteration 2317/3560 Training loss: 1.8835 0.6754 sec/batch
Epoch 14/20  Iteration 2318/3560 Training loss: 1.8782 0.6772 sec/batch
Epoch 14/20  Iteration 2319/3560 Training loss: 1.8754 0.6802 sec/batch
Epoch 14/20  Iteration 2320/3560 Training loss: 1.8655 0.6777 sec/batch
Epoch 14/20  Iteration 2321/3560 Training loss: 1.8665 0.6794 sec/batch
Epoch 14/20  Iteration 2322/3560 Training loss: 1.8645 0.6808 sec/batch
Epoch 14/20  Iteration 2323/3560 Training loss: 1.8665 0.6848 sec/batch
Epoch 14/20  Iteration 2324/3560 Training loss: 1.8658 0.6686 sec/batch
Epoch 14/20  Iteration 2325/3560 Training loss: 1.8632 0.6810 sec/batch
Epoch 14/20  Iteration 2326/3560 Training loss: 1.8616 0.6861 sec/batch
Epoch 14/20  Iteration 2327/3560 Training loss: 1.8612 0.6860 sec/batch
Epoch 14/20  Iteration 2328/3560 Training loss: 1.8634 0.6780 sec/batch
Epoch 14/20  Iteration 2329/3560 Training loss: 1.8628 0.6747 sec/batch
Epoch 14/20  Iteration 2330/3560 Training loss: 1.8614 0.6768 sec/batch
Epoch 14/20  Iteration 2331/3560 Training loss: 1.8613 0.6792 sec/batch
Epoch 14/20  Iteration 2332/3560 Training loss: 1.8636 0.6784 sec/batch
Epoch 14/20  Iteration 2333/3560 Training loss: 1.8633 0.6794 sec/batch
Epoch 14/20  Iteration 2334/3560 Training loss: 1.8635 0.6993 sec/batch
Epoch 14/20  Iteration 2335/3560 Training loss: 1.8629 0.6777 sec/batch
Epoch 14/20  Iteration 2336/3560 Training loss: 1.8644 0.6838 sec/batch
Epoch 14/20  Iteration 2337/3560 Training loss: 1.8634 0.6894 sec/batch
Epoch 14/20  Iteration 2338/3560 Training loss: 1.8628 0.6797 sec/batch
Epoch 14/20  Iteration 2339/3560 Training loss: 1.8627 0.6774 sec/batch
Epoch 14/20  Iteration 2340/3560 Training loss: 1.8619 0.6934 sec/batch
Epoch 14/20  Iteration 2341/3560 Training loss: 1.8615 0.8810 sec/batch
Epoch 14/20  Iteration 2342/3560 Training loss: 1.8618 0.8742 sec/batch
Epoch 14/20  Iteration 2343/3560 Training loss: 1.8630 0.6770 sec/batch
Epoch 14/20  Iteration 2344/3560 Training loss: 1.8631 0.6826 sec/batch
Epoch 14/20  Iteration 2345/3560 Training loss: 1.8634 0.6867 sec/batch
Epoch 14/20  Iteration 2346/3560 Training loss: 1.8626 0.6860 sec/batch
Epoch 14/20  Iteration 2347/3560 Training loss: 1.8628 0.6905 sec/batch
Epoch 14/20  Iteration 2348/3560 Training loss: 1.8636 0.6812 sec/batch
Epoch 14/20  Iteration 2349/3560 Training loss: 1.8629 0.6811 sec/batch
Epoch 14/20  Iteration 2350/3560 Training loss: 1.8627 0.6732 sec/batch
Epoch 14/20  Iteration 2351/3560 Training loss: 1.8623 0.6899 sec/batch
Epoch 14/20  Iteration 2352/3560 Training loss: 1.8611 0.6876 sec/batch
Epoch 14/20  Iteration 2353/3560 Training loss: 1.8601 0.6868 sec/batch
Epoch 14/20  Iteration 2354/3560 Training loss: 1.8597 0.6829 sec/batch
Epoch 14/20  Iteration 2355/3560 Training loss: 1.8592 0.6896 sec/batch
Epoch 14/20  Iteration 2356/3560 Training loss: 1.8593 0.6856 sec/batch
Epoch 14/20  Iteration 2357/3560 Training loss: 1.8590 0.6729 sec/batch
Epoch 14/20  Iteration 2358/3560 Training loss: 1.8581 0.6896 sec/batch
Epoch 14/20  Iteration 2359/3560 Training loss: 1.8581 0.6934 sec/batch
Epoch 14/20  Iteration 2360/3560 Training loss: 1.8571 0.6760 sec/batch
Epoch 14/20  Iteration 2361/3560 Training loss: 1.8569 0.6951 sec/batch
Epoch 14/20  Iteration 2362/3560 Training loss: 1.8565 0.6793 sec/batch
Epoch 14/20  Iteration 2363/3560 Training loss: 1.8565 0.6945 sec/batch
Epoch 14/20  Iteration 2364/3560 Training loss: 1.8573 0.6901 sec/batch
Epoch 14/20  Iteration 2365/3560 Training loss: 1.8570 0.6737 sec/batch
Epoch 14/20  Iteration 2366/3560 Training loss: 1.8577 0.6786 sec/batch
Epoch 14/20  Iteration 2367/3560 Training loss: 1.8575 0.6769 sec/batch
Epoch 14/20  Iteration 2368/3560 Training loss: 1.8575 0.6900 sec/batch
Epoch 14/20  Iteration 2369/3560 Training loss: 1.8571 0.7054 sec/batch
Epoch 14/20  Iteration 2370/3560 Training loss: 1.8573 0.6827 sec/batch
Epoch 14/20  Iteration 2371/3560 Training loss: 1.8574 0.6780 sec/batch
Epoch 14/20  Iteration 2372/3560 Training loss: 1.8571 0.6696 sec/batch
Epoch 14/20  Iteration 2373/3560 Training loss: 1.8568 0.6963 sec/batch
Epoch 14/20  Iteration 2374/3560 Training loss: 1.8571 0.6787 sec/batch
Epoch 14/20  Iteration 2375/3560 Training loss: 1.8568 0.6787 sec/batch
Epoch 14/20  Iteration 2376/3560 Training loss: 1.8575 0.7046 sec/batch
Epoch 14/20  Iteration 2377/3560 Training loss: 1.8580 0.7016 sec/batch
Epoch 14/20  Iteration 2378/3560 Training loss: 1.8581 0.6820 sec/batch
Epoch 14/20  Iteration 2379/3560 Training loss: 1.8582 0.6744 sec/batch
Epoch 14/20  Iteration 2380/3560 Training loss: 1.8585 0.6867 sec/batch
Epoch 14/20  Iteration 2381/3560 Training loss: 1.8585 0.6758 sec/batch
Epoch 14/20  Iteration 2382/3560 Training loss: 1.8582 0.6682 sec/batch
Epoch 14/20  Iteration 2383/3560 Training loss: 1.8582 0.6795 sec/batch
Epoch 14/20  Iteration 2384/3560 Training loss: 1.8581 0.6852 sec/batch
Epoch 14/20  Iteration 2385/3560 Training loss: 1.8586 0.6828 sec/batch
Epoch 14/20  Iteration 2386/3560 Training loss: 1.8586 0.6754 sec/batch
Epoch 14/20  Iteration 2387/3560 Training loss: 1.8591 0.6665 sec/batch
Epoch 14/20  Iteration 2388/3560 Training loss: 1.8588 0.6874 sec/batch
Epoch 14/20  Iteration 2389/3560 Training loss: 1.8587 0.6997 sec/batch
Epoch 14/20  Iteration 2390/3560 Training loss: 1.8590 0.6802 sec/batch
Epoch 14/20  Iteration 2391/3560 Training loss: 1.8589 0.6869 sec/batch
Epoch 14/20  Iteration 2392/3560 Training loss: 1.8590 0.6847 sec/batch
Epoch 14/20  Iteration 2393/3560 Training loss: 1.8585 0.6752 sec/batch
Epoch 14/20  Iteration 2394/3560 Training loss: 1.8583 0.6581 sec/batch
Epoch 14/20  Iteration 2395/3560 Training loss: 1.8579 0.6762 sec/batch
Epoch 14/20  Iteration 2396/3560 Training loss: 1.8580 0.7263 sec/batch
Epoch 14/20  Iteration 2397/3560 Training loss: 1.8575 0.7411 sec/batch
Epoch 14/20  Iteration 2398/3560 Training loss: 1.8574 0.7066 sec/batch
Epoch 14/20  Iteration 2399/3560 Training loss: 1.8569 0.6837 sec/batch
Epoch 14/20  Iteration 2400/3560 Training loss: 1.8565 0.7054 sec/batch
Epoch 14/20  Iteration 2401/3560 Training loss: 1.8563 0.6757 sec/batch
Epoch 14/20  Iteration 2402/3560 Training loss: 1.8558 0.6805 sec/batch
Epoch 14/20  Iteration 2403/3560 Training loss: 1.8554 0.6801 sec/batch
Epoch 14/20  Iteration 2404/3560 Training loss: 1.8555 0.6796 sec/batch
Epoch 14/20  Iteration 2405/3560 Training loss: 1.8553 0.6853 sec/batch
Epoch 14/20  Iteration 2406/3560 Training loss: 1.8552 0.6737 sec/batch
Epoch 14/20  Iteration 2407/3560 Training loss: 1.8548 0.6893 sec/batch
Epoch 14/20  Iteration 2408/3560 Training loss: 1.8544 0.6744 sec/batch
Epoch 14/20  Iteration 2409/3560 Training loss: 1.8541 0.6807 sec/batch
Epoch 14/20  Iteration 2410/3560 Training loss: 1.8540 0.6735 sec/batch
Epoch 14/20  Iteration 2411/3560 Training loss: 1.8539 0.6823 sec/batch
Epoch 14/20  Iteration 2412/3560 Training loss: 1.8536 0.6839 sec/batch
Epoch 14/20  Iteration 2413/3560 Training loss: 1.8533 0.6782 sec/batch
Epoch 14/20  Iteration 2414/3560 Training loss: 1.8529 0.6659 sec/batch
Epoch 14/20  Iteration 2415/3560 Training loss: 1.8529 0.6845 sec/batch
Epoch 14/20  Iteration 2416/3560 Training loss: 1.8528 0.6724 sec/batch
Epoch 14/20  Iteration 2417/3560 Training loss: 1.8526 0.6747 sec/batch
Epoch 14/20  Iteration 2418/3560 Training loss: 1.8524 0.6744 sec/batch
Epoch 14/20  Iteration 2419/3560 Training loss: 1.8523 0.6774 sec/batch
Epoch 14/20  Iteration 2420/3560 Training loss: 1.8521 0.6920 sec/batch
Epoch 14/20  Iteration 2421/3560 Training loss: 1.8521 0.6796 sec/batch
Epoch 14/20  Iteration 2422/3560 Training loss: 1.8520 0.6665 sec/batch
Epoch 14/20  Iteration 2423/3560 Training loss: 1.8520 0.6947 sec/batch
Epoch 14/20  Iteration 2424/3560 Training loss: 1.8520 0.6862 sec/batch
Epoch 14/20  Iteration 2425/3560 Training loss: 1.8520 0.6731 sec/batch
Epoch 14/20  Iteration 2426/3560 Training loss: 1.8519 0.6695 sec/batch
Epoch 14/20  Iteration 2427/3560 Training loss: 1.8518 0.6813 sec/batch
Epoch 14/20  Iteration 2428/3560 Training loss: 1.8516 0.6911 sec/batch
Epoch 14/20  Iteration 2429/3560 Training loss: 1.8514 0.6666 sec/batch
Epoch 14/20  Iteration 2430/3560 Training loss: 1.8511 0.6819 sec/batch
Epoch 14/20  Iteration 2431/3560 Training loss: 1.8511 0.6665 sec/batch
Epoch 14/20  Iteration 2432/3560 Training loss: 1.8511 0.6658 sec/batch
Epoch 14/20  Iteration 2433/3560 Training loss: 1.8511 0.6780 sec/batch
Epoch 14/20  Iteration 2434/3560 Training loss: 1.8512 0.6674 sec/batch
Epoch 14/20  Iteration 2435/3560 Training loss: 1.8512 0.6800 sec/batch
Epoch 14/20  Iteration 2436/3560 Training loss: 1.8509 0.6811 sec/batch
Epoch 14/20  Iteration 2437/3560 Training loss: 1.8506 0.6788 sec/batch
Epoch 14/20  Iteration 2438/3560 Training loss: 1.8507 0.6800 sec/batch
Epoch 14/20  Iteration 2439/3560 Training loss: 1.8507 0.6821 sec/batch
Epoch 14/20  Iteration 2440/3560 Training loss: 1.8503 0.6712 sec/batch
Epoch 14/20  Iteration 2441/3560 Training loss: 1.8504 0.6793 sec/batch
Epoch 14/20  Iteration 2442/3560 Training loss: 1.8505 0.6805 sec/batch
Epoch 14/20  Iteration 2443/3560 Training loss: 1.8504 0.6843 sec/batch
Epoch 14/20  Iteration 2444/3560 Training loss: 1.8503 0.6754 sec/batch
Epoch 14/20  Iteration 2445/3560 Training loss: 1.8501 0.6577 sec/batch
Epoch 14/20  Iteration 2446/3560 Training loss: 1.8499 0.6916 sec/batch
Epoch 14/20  Iteration 2447/3560 Training loss: 1.8500 0.6824 sec/batch
Epoch 14/20  Iteration 2448/3560 Training loss: 1.8500 0.6666 sec/batch
Epoch 14/20  Iteration 2449/3560 Training loss: 1.8499 0.6712 sec/batch
Epoch 14/20  Iteration 2450/3560 Training loss: 1.8500 0.6807 sec/batch
Epoch 14/20  Iteration 2451/3560 Training loss: 1.8500 0.6744 sec/batch
Epoch 14/20  Iteration 2452/3560 Training loss: 1.8501 0.6718 sec/batch
Epoch 14/20  Iteration 2453/3560 Training loss: 1.8502 0.6851 sec/batch
Epoch 14/20  Iteration 2454/3560 Training loss: 1.8501 0.6831 sec/batch
Epoch 14/20  Iteration 2455/3560 Training loss: 1.8503 0.6753 sec/batch
Epoch 14/20  Iteration 2456/3560 Training loss: 1.8502 0.6699 sec/batch
Epoch 14/20  Iteration 2457/3560 Training loss: 1.8501 0.6567 sec/batch
Epoch 14/20  Iteration 2458/3560 Training loss: 1.8501 0.6760 sec/batch
Epoch 14/20  Iteration 2459/3560 Training loss: 1.8501 0.6806 sec/batch
Epoch 14/20  Iteration 2460/3560 Training loss: 1.8502 0.6846 sec/batch
Epoch 14/20  Iteration 2461/3560 Training loss: 1.8501 0.6682 sec/batch
Epoch 14/20  Iteration 2462/3560 Training loss: 1.8503 0.6785 sec/batch
Epoch 14/20  Iteration 2463/3560 Training loss: 1.8503 0.6713 sec/batch
Epoch 14/20  Iteration 2464/3560 Training loss: 1.8502 0.6985 sec/batch
Epoch 14/20  Iteration 2465/3560 Training loss: 1.8500 0.6563 sec/batch
Epoch 14/20  Iteration 2466/3560 Training loss: 1.8501 0.6746 sec/batch
Epoch 14/20  Iteration 2467/3560 Training loss: 1.8502 0.6653 sec/batch
Epoch 14/20  Iteration 2468/3560 Training loss: 1.8502 0.6857 sec/batch
Epoch 14/20  Iteration 2469/3560 Training loss: 1.8502 0.6796 sec/batch
Epoch 14/20  Iteration 2470/3560 Training loss: 1.8502 0.6774 sec/batch
Epoch 14/20  Iteration 2471/3560 Training loss: 1.8502 0.6738 sec/batch
Epoch 14/20  Iteration 2472/3560 Training loss: 1.8502 0.6675 sec/batch
Epoch 14/20  Iteration 2473/3560 Training loss: 1.8500 0.6770 sec/batch
Epoch 14/20  Iteration 2474/3560 Training loss: 1.8501 0.6767 sec/batch
Epoch 14/20  Iteration 2475/3560 Training loss: 1.8503 0.6604 sec/batch
Epoch 14/20  Iteration 2476/3560 Training loss: 1.8503 0.6917 sec/batch
Epoch 14/20  Iteration 2477/3560 Training loss: 1.8503 0.6800 sec/batch
Epoch 14/20  Iteration 2478/3560 Training loss: 1.8503 0.6828 sec/batch
Epoch 14/20  Iteration 2479/3560 Training loss: 1.8503 0.6844 sec/batch
Epoch 14/20  Iteration 2480/3560 Training loss: 1.8502 0.6883 sec/batch
Epoch 14/20  Iteration 2481/3560 Training loss: 1.8502 0.6658 sec/batch
Epoch 14/20  Iteration 2482/3560 Training loss: 1.8506 0.6846 sec/batch
Epoch 14/20  Iteration 2483/3560 Training loss: 1.8504 0.6777 sec/batch
Epoch 14/20  Iteration 2484/3560 Training loss: 1.8503 0.6788 sec/batch
Epoch 14/20  Iteration 2485/3560 Training loss: 1.8502 0.6815 sec/batch
Epoch 14/20  Iteration 2486/3560 Training loss: 1.8501 0.6805 sec/batch
Epoch 14/20  Iteration 2487/3560 Training loss: 1.8501 0.7955 sec/batch
Epoch 14/20  Iteration 2488/3560 Training loss: 1.8502 0.8794 sec/batch
Epoch 14/20  Iteration 2489/3560 Training loss: 1.8502 0.9515 sec/batch
Epoch 14/20  Iteration 2490/3560 Training loss: 1.8501 0.8939 sec/batch
Epoch 14/20  Iteration 2491/3560 Training loss: 1.8501 0.7703 sec/batch
Epoch 14/20  Iteration 2492/3560 Training loss: 1.8501 0.6737 sec/batch
Epoch 15/20  Iteration 2493/3560 Training loss: 1.9176 0.6777 sec/batch
Epoch 15/20  Iteration 2494/3560 Training loss: 1.8728 0.6852 sec/batch
Epoch 15/20  Iteration 2495/3560 Training loss: 1.8643 0.6817 sec/batch
Epoch 15/20  Iteration 2496/3560 Training loss: 1.8566 0.6831 sec/batch
Epoch 15/20  Iteration 2497/3560 Training loss: 1.8542 0.6839 sec/batch
Epoch 15/20  Iteration 2498/3560 Training loss: 1.8460 0.6708 sec/batch
Epoch 15/20  Iteration 2499/3560 Training loss: 1.8447 0.6704 sec/batch
Epoch 15/20  Iteration 2500/3560 Training loss: 1.8433 0.6778 sec/batch
Epoch 15/20  Iteration 2501/3560 Training loss: 1.8462 0.6759 sec/batch
Epoch 15/20  Iteration 2502/3560 Training loss: 1.8452 0.6779 sec/batch
Epoch 15/20  Iteration 2503/3560 Training loss: 1.8427 0.6769 sec/batch
Epoch 15/20  Iteration 2504/3560 Training loss: 1.8424 0.6826 sec/batch
Epoch 15/20  Iteration 2505/3560 Training loss: 1.8422 0.6853 sec/batch
Epoch 15/20  Iteration 2506/3560 Training loss: 1.8448 0.6831 sec/batch
Epoch 15/20  Iteration 2507/3560 Training loss: 1.8437 0.6944 sec/batch
Epoch 15/20  Iteration 2508/3560 Training loss: 1.8424 0.6734 sec/batch
Epoch 15/20  Iteration 2509/3560 Training loss: 1.8422 0.6737 sec/batch
Epoch 15/20  Iteration 2510/3560 Training loss: 1.8442 0.6840 sec/batch
Epoch 15/20  Iteration 2511/3560 Training loss: 1.8436 0.6848 sec/batch
Epoch 15/20  Iteration 2512/3560 Training loss: 1.8441 0.6723 sec/batch
Epoch 15/20  Iteration 2513/3560 Training loss: 1.8431 0.6702 sec/batch
Epoch 15/20  Iteration 2514/3560 Training loss: 1.8441 0.6714 sec/batch
Epoch 15/20  Iteration 2515/3560 Training loss: 1.8432 0.6802 sec/batch
Epoch 15/20  Iteration 2516/3560 Training loss: 1.8427 0.6839 sec/batch
Epoch 15/20  Iteration 2517/3560 Training loss: 1.8428 0.6833 sec/batch
Epoch 15/20  Iteration 2518/3560 Training loss: 1.8418 0.6815 sec/batch
Epoch 15/20  Iteration 2519/3560 Training loss: 1.8413 0.6770 sec/batch
Epoch 15/20  Iteration 2520/3560 Training loss: 1.8418 0.6767 sec/batch
Epoch 15/20  Iteration 2521/3560 Training loss: 1.8428 0.6804 sec/batch
Epoch 15/20  Iteration 2522/3560 Training loss: 1.8431 0.6833 sec/batch
Epoch 15/20  Iteration 2523/3560 Training loss: 1.8427 0.6806 sec/batch
Epoch 15/20  Iteration 2524/3560 Training loss: 1.8420 0.6682 sec/batch
Epoch 15/20  Iteration 2525/3560 Training loss: 1.8419 0.6815 sec/batch
Epoch 15/20  Iteration 2526/3560 Training loss: 1.8428 0.6853 sec/batch
Epoch 15/20  Iteration 2527/3560 Training loss: 1.8425 0.6786 sec/batch
Epoch 15/20  Iteration 2528/3560 Training loss: 1.8425 0.6834 sec/batch
Epoch 15/20  Iteration 2529/3560 Training loss: 1.8420 0.6825 sec/batch
Epoch 15/20  Iteration 2530/3560 Training loss: 1.8405 0.6791 sec/batch
Epoch 15/20  Iteration 2531/3560 Training loss: 1.8394 0.6852 sec/batch
Epoch 15/20  Iteration 2532/3560 Training loss: 1.8385 0.6844 sec/batch
Epoch 15/20  Iteration 2533/3560 Training loss: 1.8378 0.6776 sec/batch
Epoch 15/20  Iteration 2534/3560 Training loss: 1.8381 0.6843 sec/batch
Epoch 15/20  Iteration 2535/3560 Training loss: 1.8377 0.6746 sec/batch
Epoch 15/20  Iteration 2536/3560 Training loss: 1.8370 0.6830 sec/batch
Epoch 15/20  Iteration 2537/3560 Training loss: 1.8374 0.6855 sec/batch
Epoch 15/20  Iteration 2538/3560 Training loss: 1.8361 0.6766 sec/batch
Epoch 15/20  Iteration 2539/3560 Training loss: 1.8361 0.6708 sec/batch
Epoch 15/20  Iteration 2540/3560 Training loss: 1.8359 0.6756 sec/batch
Epoch 15/20  Iteration 2541/3560 Training loss: 1.8357 0.6844 sec/batch
Epoch 15/20  Iteration 2542/3560 Training loss: 1.8364 0.6795 sec/batch
Epoch 15/20  Iteration 2543/3560 Training loss: 1.8359 0.6762 sec/batch
Epoch 15/20  Iteration 2544/3560 Training loss: 1.8366 0.6815 sec/batch
Epoch 15/20  Iteration 2545/3560 Training loss: 1.8361 0.6873 sec/batch
Epoch 15/20  Iteration 2546/3560 Training loss: 1.8360 0.6743 sec/batch
Epoch 15/20  Iteration 2547/3560 Training loss: 1.8357 0.6817 sec/batch
Epoch 15/20  Iteration 2548/3560 Training loss: 1.8360 0.6815 sec/batch
Epoch 15/20  Iteration 2549/3560 Training loss: 1.8362 0.6780 sec/batch
Epoch 15/20  Iteration 2550/3560 Training loss: 1.8358 0.6947 sec/batch
Epoch 15/20  Iteration 2551/3560 Training loss: 1.8353 0.6810 sec/batch
Epoch 15/20  Iteration 2552/3560 Training loss: 1.8359 0.6832 sec/batch
Epoch 15/20  Iteration 2553/3560 Training loss: 1.8358 0.6791 sec/batch
Epoch 15/20  Iteration 2554/3560 Training loss: 1.8364 0.6730 sec/batch
Epoch 15/20  Iteration 2555/3560 Training loss: 1.8367 0.6799 sec/batch
Epoch 15/20  Iteration 2556/3560 Training loss: 1.8368 0.6746 sec/batch
Epoch 15/20  Iteration 2557/3560 Training loss: 1.8368 0.6853 sec/batch
Epoch 15/20  Iteration 2558/3560 Training loss: 1.8370 0.6785 sec/batch
Epoch 15/20  Iteration 2559/3560 Training loss: 1.8369 0.6847 sec/batch
Epoch 15/20  Iteration 2560/3560 Training loss: 1.8366 0.6847 sec/batch
Epoch 15/20  Iteration 2561/3560 Training loss: 1.8365 0.6782 sec/batch
Epoch 15/20  Iteration 2562/3560 Training loss: 1.8363 0.6821 sec/batch
Epoch 15/20  Iteration 2563/3560 Training loss: 1.8366 0.6843 sec/batch
Epoch 15/20  Iteration 2564/3560 Training loss: 1.8368 0.6819 sec/batch
Epoch 15/20  Iteration 2565/3560 Training loss: 1.8372 0.6837 sec/batch
Epoch 15/20  Iteration 2566/3560 Training loss: 1.8369 0.6803 sec/batch
Epoch 15/20  Iteration 2567/3560 Training loss: 1.8367 0.6751 sec/batch
Epoch 15/20  Iteration 2568/3560 Training loss: 1.8370 0.6852 sec/batch
Epoch 15/20  Iteration 2569/3560 Training loss: 1.8369 0.6734 sec/batch
Epoch 15/20  Iteration 2570/3560 Training loss: 1.8370 0.6946 sec/batch
Epoch 15/20  Iteration 2571/3560 Training loss: 1.8366 0.7855 sec/batch
Epoch 15/20  Iteration 2572/3560 Training loss: 1.8365 0.6698 sec/batch
Epoch 15/20  Iteration 2573/3560 Training loss: 1.8362 0.7170 sec/batch
Epoch 15/20  Iteration 2574/3560 Training loss: 1.8363 0.7086 sec/batch
Epoch 15/20  Iteration 2575/3560 Training loss: 1.8357 0.6798 sec/batch
Epoch 15/20  Iteration 2576/3560 Training loss: 1.8357 0.6920 sec/batch
Epoch 15/20  Iteration 2577/3560 Training loss: 1.8351 0.6792 sec/batch
Epoch 15/20  Iteration 2578/3560 Training loss: 1.8347 0.6791 sec/batch
Epoch 15/20  Iteration 2579/3560 Training loss: 1.8346 0.6793 sec/batch
Epoch 15/20  Iteration 2580/3560 Training loss: 1.8343 0.6851 sec/batch
Epoch 15/20  Iteration 2581/3560 Training loss: 1.8339 0.6829 sec/batch
Epoch 15/20  Iteration 2582/3560 Training loss: 1.8340 0.6852 sec/batch
Epoch 15/20  Iteration 2583/3560 Training loss: 1.8337 0.6839 sec/batch
Epoch 15/20  Iteration 2584/3560 Training loss: 1.8336 0.6679 sec/batch
Epoch 15/20  Iteration 2585/3560 Training loss: 1.8332 0.6786 sec/batch
Epoch 15/20  Iteration 2586/3560 Training loss: 1.8328 0.6574 sec/batch
Epoch 15/20  Iteration 2587/3560 Training loss: 1.8325 0.6826 sec/batch
Epoch 15/20  Iteration 2588/3560 Training loss: 1.8324 0.6780 sec/batch
Epoch 15/20  Iteration 2589/3560 Training loss: 1.8323 0.6754 sec/batch
Epoch 15/20  Iteration 2590/3560 Training loss: 1.8320 0.6791 sec/batch
Epoch 15/20  Iteration 2591/3560 Training loss: 1.8315 0.6806 sec/batch
Epoch 15/20  Iteration 2592/3560 Training loss: 1.8312 0.6798 sec/batch
Epoch 15/20  Iteration 2593/3560 Training loss: 1.8312 0.6747 sec/batch
Epoch 15/20  Iteration 2594/3560 Training loss: 1.8312 0.6908 sec/batch
Epoch 15/20  Iteration 2595/3560 Training loss: 1.8309 0.6820 sec/batch
Epoch 15/20  Iteration 2596/3560 Training loss: 1.8308 0.6901 sec/batch
Epoch 15/20  Iteration 2597/3560 Training loss: 1.8306 0.6774 sec/batch
Epoch 15/20  Iteration 2598/3560 Training loss: 1.8304 0.6817 sec/batch
Epoch 15/20  Iteration 2599/3560 Training loss: 1.8304 0.6787 sec/batch
Epoch 15/20  Iteration 2600/3560 Training loss: 1.8303 0.6863 sec/batch
Epoch 15/20  Iteration 2601/3560 Training loss: 1.8304 0.6679 sec/batch
Epoch 15/20  Iteration 2602/3560 Training loss: 1.8304 0.6763 sec/batch
Epoch 15/20  Iteration 2603/3560 Training loss: 1.8304 0.6876 sec/batch
Epoch 15/20  Iteration 2604/3560 Training loss: 1.8301 0.6840 sec/batch
Epoch 15/20  Iteration 2605/3560 Training loss: 1.8300 0.6808 sec/batch
Epoch 15/20  Iteration 2606/3560 Training loss: 1.8298 0.6819 sec/batch
Epoch 15/20  Iteration 2607/3560 Training loss: 1.8296 0.6915 sec/batch
Epoch 15/20  Iteration 2608/3560 Training loss: 1.8293 0.6991 sec/batch
Epoch 15/20  Iteration 2609/3560 Training loss: 1.8293 0.6817 sec/batch
Epoch 15/20  Iteration 2610/3560 Training loss: 1.8293 0.6788 sec/batch
Epoch 15/20  Iteration 2611/3560 Training loss: 1.8293 0.6851 sec/batch
Epoch 15/20  Iteration 2612/3560 Training loss: 1.8292 0.6744 sec/batch
Epoch 15/20  Iteration 2613/3560 Training loss: 1.8292 0.6905 sec/batch
Epoch 15/20  Iteration 2614/3560 Training loss: 1.8290 0.6783 sec/batch
Epoch 15/20  Iteration 2615/3560 Training loss: 1.8288 0.6818 sec/batch
Epoch 15/20  Iteration 2616/3560 Training loss: 1.8289 0.6735 sec/batch
Epoch 15/20  Iteration 2617/3560 Training loss: 1.8290 0.6870 sec/batch
Epoch 15/20  Iteration 2618/3560 Training loss: 1.8287 0.6801 sec/batch
Epoch 15/20  Iteration 2619/3560 Training loss: 1.8289 0.6834 sec/batch
Epoch 15/20  Iteration 2620/3560 Training loss: 1.8291 0.6800 sec/batch
Epoch 15/20  Iteration 2621/3560 Training loss: 1.8290 0.6807 sec/batch
Epoch 15/20  Iteration 2622/3560 Training loss: 1.8289 0.6674 sec/batch
Epoch 15/20  Iteration 2623/3560 Training loss: 1.8286 0.6774 sec/batch
Epoch 15/20  Iteration 2624/3560 Training loss: 1.8284 0.6585 sec/batch
Epoch 15/20  Iteration 2625/3560 Training loss: 1.8284 0.6901 sec/batch
Epoch 15/20  Iteration 2626/3560 Training loss: 1.8284 0.6856 sec/batch
Epoch 15/20  Iteration 2627/3560 Training loss: 1.8284 0.6861 sec/batch
Epoch 15/20  Iteration 2628/3560 Training loss: 1.8285 0.6844 sec/batch
Epoch 15/20  Iteration 2629/3560 Training loss: 1.8286 0.6755 sec/batch
Epoch 15/20  Iteration 2630/3560 Training loss: 1.8286 0.6778 sec/batch
Epoch 15/20  Iteration 2631/3560 Training loss: 1.8288 0.6806 sec/batch
Epoch 15/20  Iteration 2632/3560 Training loss: 1.8287 0.6851 sec/batch
Epoch 15/20  Iteration 2633/3560 Training loss: 1.8289 0.6862 sec/batch
Epoch 15/20  Iteration 2634/3560 Training loss: 1.8289 0.6820 sec/batch
Epoch 15/20  Iteration 2635/3560 Training loss: 1.8289 0.6880 sec/batch
Epoch 15/20  Iteration 2636/3560 Training loss: 1.8289 0.6794 sec/batch
Epoch 15/20  Iteration 2637/3560 Training loss: 1.8288 0.6933 sec/batch
Epoch 15/20  Iteration 2638/3560 Training loss: 1.8289 0.6812 sec/batch
Epoch 15/20  Iteration 2639/3560 Training loss: 1.8290 0.6729 sec/batch
Epoch 15/20  Iteration 2640/3560 Training loss: 1.8291 0.6720 sec/batch
Epoch 15/20  Iteration 2641/3560 Training loss: 1.8292 0.6829 sec/batch
Epoch 15/20  Iteration 2642/3560 Training loss: 1.8290 0.6854 sec/batch
Epoch 15/20  Iteration 2643/3560 Training loss: 1.8288 0.6867 sec/batch
Epoch 15/20  Iteration 2644/3560 Training loss: 1.8290 0.6835 sec/batch
Epoch 15/20  Iteration 2645/3560 Training loss: 1.8290 0.6751 sec/batch
Epoch 15/20  Iteration 2646/3560 Training loss: 1.8291 0.6782 sec/batch
Epoch 15/20  Iteration 2647/3560 Training loss: 1.8290 0.6812 sec/batch
Epoch 15/20  Iteration 2648/3560 Training loss: 1.8289 0.6758 sec/batch
Epoch 15/20  Iteration 2649/3560 Training loss: 1.8290 0.6999 sec/batch
Epoch 15/20  Iteration 2650/3560 Training loss: 1.8290 0.7738 sec/batch
Epoch 15/20  Iteration 2651/3560 Training loss: 1.8287 0.6419 sec/batch
Epoch 15/20  Iteration 2652/3560 Training loss: 1.8289 0.6370 sec/batch
Epoch 15/20  Iteration 2653/3560 Training loss: 1.8291 0.6236 sec/batch
Epoch 15/20  Iteration 2654/3560 Training loss: 1.8290 0.6729 sec/batch
Epoch 15/20  Iteration 2655/3560 Training loss: 1.8291 0.6632 sec/batch
Epoch 15/20  Iteration 2656/3560 Training loss: 1.8291 0.6368 sec/batch
Epoch 15/20  Iteration 2657/3560 Training loss: 1.8291 0.8388 sec/batch
Epoch 15/20  Iteration 2658/3560 Training loss: 1.8290 0.8966 sec/batch
Epoch 15/20  Iteration 2659/3560 Training loss: 1.8290 0.8806 sec/batch
Epoch 15/20  Iteration 2660/3560 Training loss: 1.8294 1.0602 sec/batch
Epoch 15/20  Iteration 2661/3560 Training loss: 1.8293 0.9781 sec/batch
Epoch 15/20  Iteration 2662/3560 Training loss: 1.8292 0.7820 sec/batch
Epoch 15/20  Iteration 2663/3560 Training loss: 1.8291 0.7534 sec/batch
Epoch 15/20  Iteration 2664/3560 Training loss: 1.8289 0.6917 sec/batch
Epoch 15/20  Iteration 2665/3560 Training loss: 1.8290 0.7528 sec/batch
Epoch 15/20  Iteration 2666/3560 Training loss: 1.8289 0.7766 sec/batch
Epoch 15/20  Iteration 2667/3560 Training loss: 1.8290 0.6778 sec/batch
Epoch 15/20  Iteration 2668/3560 Training loss: 1.8289 0.6854 sec/batch
Epoch 15/20  Iteration 2669/3560 Training loss: 1.8287 0.6743 sec/batch
Epoch 15/20  Iteration 2670/3560 Training loss: 1.8288 0.6752 sec/batch
Epoch 16/20  Iteration 2671/3560 Training loss: 1.8982 0.6741 sec/batch
Epoch 16/20  Iteration 2672/3560 Training loss: 1.8580 0.6783 sec/batch
Epoch 16/20  Iteration 2673/3560 Training loss: 1.8477 0.6827 sec/batch
Epoch 16/20  Iteration 2674/3560 Training loss: 1.8430 0.6818 sec/batch
Epoch 16/20  Iteration 2675/3560 Training loss: 1.8405 0.6678 sec/batch
Epoch 16/20  Iteration 2676/3560 Training loss: 1.8305 0.6769 sec/batch
Epoch 16/20  Iteration 2677/3560 Training loss: 1.8295 0.6911 sec/batch
Epoch 16/20  Iteration 2678/3560 Training loss: 1.8274 0.6818 sec/batch
Epoch 16/20  Iteration 2679/3560 Training loss: 1.8294 0.7026 sec/batch
Epoch 16/20  Iteration 2680/3560 Training loss: 1.8287 0.6640 sec/batch
Epoch 16/20  Iteration 2681/3560 Training loss: 1.8262 0.6892 sec/batch
Epoch 16/20  Iteration 2682/3560 Training loss: 1.8243 0.6723 sec/batch
Epoch 16/20  Iteration 2683/3560 Training loss: 1.8238 0.6814 sec/batch
Epoch 16/20  Iteration 2684/3560 Training loss: 1.8264 0.6750 sec/batch
Epoch 16/20  Iteration 2685/3560 Training loss: 1.8251 0.6820 sec/batch
Epoch 16/20  Iteration 2686/3560 Training loss: 1.8236 0.6850 sec/batch
Epoch 16/20  Iteration 2687/3560 Training loss: 1.8231 0.6841 sec/batch
Epoch 16/20  Iteration 2688/3560 Training loss: 1.8251 0.6746 sec/batch
Epoch 16/20  Iteration 2689/3560 Training loss: 1.8248 0.6721 sec/batch
Epoch 16/20  Iteration 2690/3560 Training loss: 1.8248 0.6708 sec/batch
Epoch 16/20  Iteration 2691/3560 Training loss: 1.8240 0.6804 sec/batch
Epoch 16/20  Iteration 2692/3560 Training loss: 1.8252 0.6732 sec/batch
Epoch 16/20  Iteration 2693/3560 Training loss: 1.8244 0.6774 sec/batch
Epoch 16/20  Iteration 2694/3560 Training loss: 1.8237 0.6708 sec/batch
Epoch 16/20  Iteration 2695/3560 Training loss: 1.8237 0.6879 sec/batch
Epoch 16/20  Iteration 2696/3560 Training loss: 1.8228 0.6745 sec/batch
Epoch 16/20  Iteration 2697/3560 Training loss: 1.8219 0.6737 sec/batch
Epoch 16/20  Iteration 2698/3560 Training loss: 1.8221 0.6775 sec/batch
Epoch 16/20  Iteration 2699/3560 Training loss: 1.8233 0.6863 sec/batch
Epoch 16/20  Iteration 2700/3560 Training loss: 1.8236 0.6797 sec/batch
Epoch 16/20  Iteration 2701/3560 Training loss: 1.8232 0.6751 sec/batch
Epoch 16/20  Iteration 2702/3560 Training loss: 1.8225 0.6797 sec/batch
Epoch 16/20  Iteration 2703/3560 Training loss: 1.8225 0.6764 sec/batch
Epoch 16/20  Iteration 2704/3560 Training loss: 1.8232 0.6804 sec/batch
Epoch 16/20  Iteration 2705/3560 Training loss: 1.8227 0.6830 sec/batch
Epoch 16/20  Iteration 2706/3560 Training loss: 1.8223 0.6796 sec/batch
Epoch 16/20  Iteration 2707/3560 Training loss: 1.8219 0.6757 sec/batch
Epoch 16/20  Iteration 2708/3560 Training loss: 1.8208 0.6817 sec/batch
Epoch 16/20  Iteration 2709/3560 Training loss: 1.8196 0.6865 sec/batch
Epoch 16/20  Iteration 2710/3560 Training loss: 1.8190 0.6754 sec/batch
Epoch 16/20  Iteration 2711/3560 Training loss: 1.8184 0.6830 sec/batch
Epoch 16/20  Iteration 2712/3560 Training loss: 1.8186 0.6887 sec/batch
Epoch 16/20  Iteration 2713/3560 Training loss: 1.8184 0.6810 sec/batch
Epoch 16/20  Iteration 2714/3560 Training loss: 1.8176 0.6782 sec/batch
Epoch 16/20  Iteration 2715/3560 Training loss: 1.8177 0.6870 sec/batch
Epoch 16/20  Iteration 2716/3560 Training loss: 1.8166 0.6821 sec/batch
Epoch 16/20  Iteration 2717/3560 Training loss: 1.8164 0.6730 sec/batch
Epoch 16/20  Iteration 2718/3560 Training loss: 1.8163 0.6790 sec/batch
Epoch 16/20  Iteration 2719/3560 Training loss: 1.8160 0.6837 sec/batch
Epoch 16/20  Iteration 2720/3560 Training loss: 1.8166 0.6793 sec/batch
Epoch 16/20  Iteration 2721/3560 Training loss: 1.8161 0.6837 sec/batch
Epoch 16/20  Iteration 2722/3560 Training loss: 1.8169 0.6799 sec/batch
Epoch 16/20  Iteration 2723/3560 Training loss: 1.8166 0.7038 sec/batch
Epoch 16/20  Iteration 2724/3560 Training loss: 1.8165 0.6809 sec/batch
Epoch 16/20  Iteration 2725/3560 Training loss: 1.8160 0.6620 sec/batch
Epoch 16/20  Iteration 2726/3560 Training loss: 1.8160 0.6654 sec/batch
Epoch 16/20  Iteration 2727/3560 Training loss: 1.8164 0.6793 sec/batch
Epoch 16/20  Iteration 2728/3560 Training loss: 1.8160 0.6805 sec/batch
Epoch 16/20  Iteration 2729/3560 Training loss: 1.8156 0.6848 sec/batch
Epoch 16/20  Iteration 2730/3560 Training loss: 1.8161 0.6837 sec/batch
Epoch 16/20  Iteration 2731/3560 Training loss: 1.8160 0.6806 sec/batch
Epoch 16/20  Iteration 2732/3560 Training loss: 1.8168 0.7060 sec/batch
Epoch 16/20  Iteration 2733/3560 Training loss: 1.8172 0.6567 sec/batch
Epoch 16/20  Iteration 2734/3560 Training loss: 1.8174 0.6680 sec/batch
Epoch 16/20  Iteration 2735/3560 Training loss: 1.8172 0.6954 sec/batch
Epoch 16/20  Iteration 2736/3560 Training loss: 1.8175 0.6747 sec/batch
Epoch 16/20  Iteration 2737/3560 Training loss: 1.8176 0.6626 sec/batch
Epoch 16/20  Iteration 2738/3560 Training loss: 1.8172 0.6673 sec/batch
Epoch 16/20  Iteration 2739/3560 Training loss: 1.8173 0.6942 sec/batch
Epoch 16/20  Iteration 2740/3560 Training loss: 1.8173 0.6819 sec/batch
Epoch 16/20  Iteration 2741/3560 Training loss: 1.8179 0.6765 sec/batch
Epoch 16/20  Iteration 2742/3560 Training loss: 1.8182 0.6721 sec/batch
Epoch 16/20  Iteration 2743/3560 Training loss: 1.8187 0.8132 sec/batch
Epoch 16/20  Iteration 2744/3560 Training loss: 1.8184 0.6737 sec/batch
Epoch 16/20  Iteration 2745/3560 Training loss: 1.8184 0.7357 sec/batch
Epoch 16/20  Iteration 2746/3560 Training loss: 1.8187 0.6831 sec/batch
Epoch 16/20  Iteration 2747/3560 Training loss: 1.8187 0.6713 sec/batch
Epoch 16/20  Iteration 2748/3560 Training loss: 1.8187 0.6654 sec/batch
Epoch 16/20  Iteration 2749/3560 Training loss: 1.8184 0.6745 sec/batch
Epoch 16/20  Iteration 2750/3560 Training loss: 1.8182 0.6805 sec/batch
Epoch 16/20  Iteration 2751/3560 Training loss: 1.8177 0.6777 sec/batch
Epoch 16/20  Iteration 2752/3560 Training loss: 1.8179 0.6766 sec/batch
Epoch 16/20  Iteration 2753/3560 Training loss: 1.8173 0.6828 sec/batch
Epoch 16/20  Iteration 2754/3560 Training loss: 1.8172 0.6852 sec/batch
Epoch 16/20  Iteration 2755/3560 Training loss: 1.8168 0.6800 sec/batch
Epoch 16/20  Iteration 2756/3560 Training loss: 1.8164 0.6886 sec/batch
Epoch 16/20  Iteration 2757/3560 Training loss: 1.8163 0.6720 sec/batch
Epoch 16/20  Iteration 2758/3560 Training loss: 1.8159 0.6761 sec/batch
Epoch 16/20  Iteration 2759/3560 Training loss: 1.8154 0.6775 sec/batch
Epoch 16/20  Iteration 2760/3560 Training loss: 1.8155 0.6858 sec/batch
Epoch 16/20  Iteration 2761/3560 Training loss: 1.8151 0.6752 sec/batch
Epoch 16/20  Iteration 2762/3560 Training loss: 1.8149 0.6952 sec/batch
Epoch 16/20  Iteration 2763/3560 Training loss: 1.8145 0.6862 sec/batch
Epoch 16/20  Iteration 2764/3560 Training loss: 1.8141 0.6768 sec/batch
Epoch 16/20  Iteration 2765/3560 Training loss: 1.8138 0.6752 sec/batch
Epoch 16/20  Iteration 2766/3560 Training loss: 1.8138 0.7016 sec/batch
Epoch 16/20  Iteration 2767/3560 Training loss: 1.8137 0.6829 sec/batch
Epoch 16/20  Iteration 2768/3560 Training loss: 1.8134 0.6688 sec/batch
Epoch 16/20  Iteration 2769/3560 Training loss: 1.8131 0.6730 sec/batch
Epoch 16/20  Iteration 2770/3560 Training loss: 1.8128 0.6692 sec/batch
Epoch 16/20  Iteration 2771/3560 Training loss: 1.8127 0.6837 sec/batch
Epoch 16/20  Iteration 2772/3560 Training loss: 1.8125 0.6753 sec/batch
Epoch 16/20  Iteration 2773/3560 Training loss: 1.8124 0.7653 sec/batch
Epoch 16/20  Iteration 2774/3560 Training loss: 1.8123 0.8740 sec/batch
Epoch 16/20  Iteration 2775/3560 Training loss: 1.8120 0.6828 sec/batch
Epoch 16/20  Iteration 2776/3560 Training loss: 1.8120 0.6760 sec/batch
Epoch 16/20  Iteration 2777/3560 Training loss: 1.8119 0.6753 sec/batch
Epoch 16/20  Iteration 2778/3560 Training loss: 1.8118 0.6786 sec/batch
Epoch 16/20  Iteration 2779/3560 Training loss: 1.8118 0.6753 sec/batch
Epoch 16/20  Iteration 2780/3560 Training loss: 1.8118 0.6807 sec/batch
Epoch 16/20  Iteration 2781/3560 Training loss: 1.8116 0.6777 sec/batch
Epoch 16/20  Iteration 2782/3560 Training loss: 1.8114 0.6839 sec/batch
Epoch 16/20  Iteration 2783/3560 Training loss: 1.8113 0.6835 sec/batch
Epoch 16/20  Iteration 2784/3560 Training loss: 1.8112 0.6801 sec/batch
Epoch 16/20  Iteration 2785/3560 Training loss: 1.8108 0.6795 sec/batch
Epoch 16/20  Iteration 2786/3560 Training loss: 1.8106 0.6826 sec/batch
Epoch 16/20  Iteration 2787/3560 Training loss: 1.8106 0.6763 sec/batch
Epoch 16/20  Iteration 2788/3560 Training loss: 1.8104 0.6914 sec/batch
Epoch 16/20  Iteration 2789/3560 Training loss: 1.8104 0.6766 sec/batch
Epoch 16/20  Iteration 2790/3560 Training loss: 1.8103 0.6830 sec/batch
Epoch 16/20  Iteration 2791/3560 Training loss: 1.8104 0.6741 sec/batch
Epoch 16/20  Iteration 2792/3560 Training loss: 1.8099 0.6813 sec/batch
Epoch 16/20  Iteration 2793/3560 Training loss: 1.8098 0.6852 sec/batch
Epoch 16/20  Iteration 2794/3560 Training loss: 1.8098 0.6766 sec/batch
Epoch 16/20  Iteration 2795/3560 Training loss: 1.8098 0.6639 sec/batch
Epoch 16/20  Iteration 2796/3560 Training loss: 1.8094 0.6975 sec/batch
Epoch 16/20  Iteration 2797/3560 Training loss: 1.8096 0.6880 sec/batch
Epoch 16/20  Iteration 2798/3560 Training loss: 1.8096 0.6768 sec/batch
Epoch 16/20  Iteration 2799/3560 Training loss: 1.8095 0.6779 sec/batch
Epoch 16/20  Iteration 2800/3560 Training loss: 1.8095 0.6783 sec/batch
Epoch 16/20  Iteration 2801/3560 Training loss: 1.8093 0.6792 sec/batch
Epoch 16/20  Iteration 2802/3560 Training loss: 1.8091 0.6775 sec/batch
Epoch 16/20  Iteration 2803/3560 Training loss: 1.8091 0.6860 sec/batch
Epoch 16/20  Iteration 2804/3560 Training loss: 1.8091 0.6872 sec/batch
Epoch 16/20  Iteration 2805/3560 Training loss: 1.8091 0.6645 sec/batch
Epoch 16/20  Iteration 2806/3560 Training loss: 1.8092 0.6734 sec/batch
Epoch 16/20  Iteration 2807/3560 Training loss: 1.8092 0.6834 sec/batch
Epoch 16/20  Iteration 2808/3560 Training loss: 1.8093 0.6789 sec/batch
Epoch 16/20  Iteration 2809/3560 Training loss: 1.8094 0.6819 sec/batch
Epoch 16/20  Iteration 2810/3560 Training loss: 1.8094 0.7013 sec/batch
Epoch 16/20  Iteration 2811/3560 Training loss: 1.8096 0.6884 sec/batch
Epoch 16/20  Iteration 2812/3560 Training loss: 1.8095 0.6839 sec/batch
Epoch 16/20  Iteration 2813/3560 Training loss: 1.8095 0.6863 sec/batch
Epoch 16/20  Iteration 2814/3560 Training loss: 1.8095 0.6803 sec/batch
Epoch 16/20  Iteration 2815/3560 Training loss: 1.8095 0.6741 sec/batch
Epoch 16/20  Iteration 2816/3560 Training loss: 1.8095 0.6797 sec/batch
Epoch 16/20  Iteration 2817/3560 Training loss: 1.8095 0.6788 sec/batch
Epoch 16/20  Iteration 2818/3560 Training loss: 1.8097 0.6844 sec/batch
Epoch 16/20  Iteration 2819/3560 Training loss: 1.8097 0.6973 sec/batch
Epoch 16/20  Iteration 2820/3560 Training loss: 1.8096 0.6927 sec/batch
Epoch 16/20  Iteration 2821/3560 Training loss: 1.8094 0.6918 sec/batch
Epoch 16/20  Iteration 2822/3560 Training loss: 1.8096 0.6796 sec/batch
Epoch 16/20  Iteration 2823/3560 Training loss: 1.8096 0.6886 sec/batch
Epoch 16/20  Iteration 2824/3560 Training loss: 1.8096 0.6834 sec/batch
Epoch 16/20  Iteration 2825/3560 Training loss: 1.8095 0.6779 sec/batch
Epoch 16/20  Iteration 2826/3560 Training loss: 1.8095 0.6765 sec/batch
Epoch 16/20  Iteration 2827/3560 Training loss: 1.8095 0.6858 sec/batch
Epoch 16/20  Iteration 2828/3560 Training loss: 1.8095 0.6878 sec/batch
Epoch 16/20  Iteration 2829/3560 Training loss: 1.8093 0.6706 sec/batch
Epoch 16/20  Iteration 2830/3560 Training loss: 1.8095 0.6831 sec/batch
Epoch 16/20  Iteration 2831/3560 Training loss: 1.8097 0.6803 sec/batch
Epoch 16/20  Iteration 2832/3560 Training loss: 1.8096 0.6810 sec/batch
Epoch 16/20  Iteration 2833/3560 Training loss: 1.8097 0.6768 sec/batch
Epoch 16/20  Iteration 2834/3560 Training loss: 1.8097 0.6745 sec/batch
Epoch 16/20  Iteration 2835/3560 Training loss: 1.8097 0.6726 sec/batch
Epoch 16/20  Iteration 2836/3560 Training loss: 1.8096 0.6745 sec/batch
Epoch 16/20  Iteration 2837/3560 Training loss: 1.8097 0.6702 sec/batch
Epoch 16/20  Iteration 2838/3560 Training loss: 1.8099 0.6807 sec/batch
Epoch 16/20  Iteration 2839/3560 Training loss: 1.8099 0.6735 sec/batch
Epoch 16/20  Iteration 2840/3560 Training loss: 1.8098 0.6863 sec/batch
Epoch 16/20  Iteration 2841/3560 Training loss: 1.8097 0.6902 sec/batch
Epoch 16/20  Iteration 2842/3560 Training loss: 1.8095 0.6737 sec/batch
Epoch 16/20  Iteration 2843/3560 Training loss: 1.8096 0.6817 sec/batch
Epoch 16/20  Iteration 2844/3560 Training loss: 1.8097 0.6785 sec/batch
Epoch 16/20  Iteration 2845/3560 Training loss: 1.8097 0.6786 sec/batch
Epoch 16/20  Iteration 2846/3560 Training loss: 1.8097 0.6777 sec/batch
Epoch 16/20  Iteration 2847/3560 Training loss: 1.8096 0.6697 sec/batch
Epoch 16/20  Iteration 2848/3560 Training loss: 1.8096 0.6483 sec/batch
Epoch 17/20  Iteration 2849/3560 Training loss: 1.8716 0.6402 sec/batch
Epoch 17/20  Iteration 2850/3560 Training loss: 1.8317 0.6982 sec/batch
Epoch 17/20  Iteration 2851/3560 Training loss: 1.8212 0.6738 sec/batch
Epoch 17/20  Iteration 2852/3560 Training loss: 1.8170 0.6766 sec/batch
Epoch 17/20  Iteration 2853/3560 Training loss: 1.8133 0.6966 sec/batch
Epoch 17/20  Iteration 2854/3560 Training loss: 1.8046 0.6885 sec/batch
Epoch 17/20  Iteration 2855/3560 Training loss: 1.8063 0.6789 sec/batch
Epoch 17/20  Iteration 2856/3560 Training loss: 1.8063 0.6735 sec/batch
Epoch 17/20  Iteration 2857/3560 Training loss: 1.8079 0.6761 sec/batch
Epoch 17/20  Iteration 2858/3560 Training loss: 1.8079 0.6817 sec/batch
Epoch 17/20  Iteration 2859/3560 Training loss: 1.8050 0.6714 sec/batch
Epoch 17/20  Iteration 2860/3560 Training loss: 1.8037 0.6710 sec/batch
Epoch 17/20  Iteration 2861/3560 Training loss: 1.8035 0.6919 sec/batch
Epoch 17/20  Iteration 2862/3560 Training loss: 1.8056 0.6823 sec/batch
Epoch 17/20  Iteration 2863/3560 Training loss: 1.8055 0.6596 sec/batch
Epoch 17/20  Iteration 2864/3560 Training loss: 1.8042 0.6636 sec/batch
Epoch 17/20  Iteration 2865/3560 Training loss: 1.8042 0.6690 sec/batch
Epoch 17/20  Iteration 2866/3560 Training loss: 1.8061 0.6686 sec/batch
Epoch 17/20  Iteration 2867/3560 Training loss: 1.8055 0.6770 sec/batch
Epoch 17/20  Iteration 2868/3560 Training loss: 1.8059 0.6685 sec/batch
Epoch 17/20  Iteration 2869/3560 Training loss: 1.8050 0.6742 sec/batch
Epoch 17/20  Iteration 2870/3560 Training loss: 1.8061 0.6746 sec/batch
Epoch 17/20  Iteration 2871/3560 Training loss: 1.8052 0.6801 sec/batch
Epoch 17/20  Iteration 2872/3560 Training loss: 1.8049 0.6756 sec/batch
Epoch 17/20  Iteration 2873/3560 Training loss: 1.8048 0.6789 sec/batch
Epoch 17/20  Iteration 2874/3560 Training loss: 1.8038 0.6961 sec/batch
Epoch 17/20  Iteration 2875/3560 Training loss: 1.8028 0.6863 sec/batch
Epoch 17/20  Iteration 2876/3560 Training loss: 1.8036 0.6676 sec/batch
Epoch 17/20  Iteration 2877/3560 Training loss: 1.8044 0.6842 sec/batch
Epoch 17/20  Iteration 2878/3560 Training loss: 1.8051 0.6785 sec/batch
Epoch 17/20  Iteration 2879/3560 Training loss: 1.8047 0.6774 sec/batch
Epoch 17/20  Iteration 2880/3560 Training loss: 1.8039 0.6765 sec/batch
Epoch 17/20  Iteration 2881/3560 Training loss: 1.8039 0.6764 sec/batch
Epoch 17/20  Iteration 2882/3560 Training loss: 1.8046 0.6872 sec/batch
Epoch 17/20  Iteration 2883/3560 Training loss: 1.8044 0.6773 sec/batch
Epoch 17/20  Iteration 2884/3560 Training loss: 1.8043 0.6742 sec/batch
Epoch 17/20  Iteration 2885/3560 Training loss: 1.8037 0.6704 sec/batch
Epoch 17/20  Iteration 2886/3560 Training loss: 1.8027 0.6783 sec/batch
Epoch 17/20  Iteration 2887/3560 Training loss: 1.8016 0.6821 sec/batch
Epoch 17/20  Iteration 2888/3560 Training loss: 1.8010 0.6757 sec/batch
Epoch 17/20  Iteration 2889/3560 Training loss: 1.8005 0.6806 sec/batch
Epoch 17/20  Iteration 2890/3560 Training loss: 1.8007 0.7026 sec/batch
Epoch 17/20  Iteration 2891/3560 Training loss: 1.8002 0.6890 sec/batch
Epoch 17/20  Iteration 2892/3560 Training loss: 1.7992 0.6697 sec/batch
Epoch 17/20  Iteration 2893/3560 Training loss: 1.7994 0.6812 sec/batch
Epoch 17/20  Iteration 2894/3560 Training loss: 1.7983 0.6717 sec/batch
Epoch 17/20  Iteration 2895/3560 Training loss: 1.7982 0.6757 sec/batch
Epoch 17/20  Iteration 2896/3560 Training loss: 1.7978 0.6804 sec/batch
Epoch 17/20  Iteration 2897/3560 Training loss: 1.7978 0.6912 sec/batch
Epoch 17/20  Iteration 2898/3560 Training loss: 1.7985 0.6790 sec/batch
Epoch 17/20  Iteration 2899/3560 Training loss: 1.7981 0.6784 sec/batch
Epoch 17/20  Iteration 2900/3560 Training loss: 1.7988 0.6730 sec/batch
Epoch 17/20  Iteration 2901/3560 Training loss: 1.7986 0.6762 sec/batch
Epoch 17/20  Iteration 2902/3560 Training loss: 1.7986 0.6821 sec/batch
Epoch 17/20  Iteration 2903/3560 Training loss: 1.7982 0.6712 sec/batch
Epoch 17/20  Iteration 2904/3560 Training loss: 1.7985 0.6809 sec/batch
Epoch 17/20  Iteration 2905/3560 Training loss: 1.7988 0.6876 sec/batch
Epoch 17/20  Iteration 2906/3560 Training loss: 1.7985 0.6843 sec/batch
Epoch 17/20  Iteration 2907/3560 Training loss: 1.7981 0.6800 sec/batch
Epoch 17/20  Iteration 2908/3560 Training loss: 1.7986 0.6783 sec/batch
Epoch 17/20  Iteration 2909/3560 Training loss: 1.7986 0.6900 sec/batch
Epoch 17/20  Iteration 2910/3560 Training loss: 1.7994 0.6720 sec/batch
Epoch 17/20  Iteration 2911/3560 Training loss: 1.7998 0.6795 sec/batch
Epoch 17/20  Iteration 2912/3560 Training loss: 1.8001 0.6734 sec/batch
Epoch 17/20  Iteration 2913/3560 Training loss: 1.7999 0.6676 sec/batch
Epoch 17/20  Iteration 2914/3560 Training loss: 1.8003 0.6811 sec/batch
Epoch 17/20  Iteration 2915/3560 Training loss: 1.8004 0.6892 sec/batch
Epoch 17/20  Iteration 2916/3560 Training loss: 1.7999 0.6842 sec/batch
Epoch 17/20  Iteration 2917/3560 Training loss: 1.7998 0.7372 sec/batch
Epoch 17/20  Iteration 2918/3560 Training loss: 1.7998 0.7255 sec/batch
Epoch 17/20  Iteration 2919/3560 Training loss: 1.8001 0.7135 sec/batch
Epoch 17/20  Iteration 2920/3560 Training loss: 1.8003 0.6860 sec/batch
Epoch 17/20  Iteration 2921/3560 Training loss: 1.8008 0.6870 sec/batch
Epoch 17/20  Iteration 2922/3560 Training loss: 1.8006 0.6871 sec/batch
Epoch 17/20  Iteration 2923/3560 Training loss: 1.8004 0.6643 sec/batch
Epoch 17/20  Iteration 2924/3560 Training loss: 1.8007 0.6783 sec/batch
Epoch 17/20  Iteration 2925/3560 Training loss: 1.8008 0.6847 sec/batch
Epoch 17/20  Iteration 2926/3560 Training loss: 1.8009 0.6815 sec/batch
Epoch 17/20  Iteration 2927/3560 Training loss: 1.8004 0.6657 sec/batch
Epoch 17/20  Iteration 2928/3560 Training loss: 1.8001 0.6783 sec/batch
Epoch 17/20  Iteration 2929/3560 Training loss: 1.7997 0.6821 sec/batch
Epoch 17/20  Iteration 2930/3560 Training loss: 1.7998 0.6874 sec/batch
Epoch 17/20  Iteration 2931/3560 Training loss: 1.7993 0.6736 sec/batch
Epoch 17/20  Iteration 2932/3560 Training loss: 1.7992 0.6799 sec/batch
Epoch 17/20  Iteration 2933/3560 Training loss: 1.7986 0.6690 sec/batch
Epoch 17/20  Iteration 2934/3560 Training loss: 1.7983 0.6776 sec/batch
Epoch 17/20  Iteration 2935/3560 Training loss: 1.7981 0.6769 sec/batch
Epoch 17/20  Iteration 2936/3560 Training loss: 1.7977 0.6831 sec/batch
Epoch 17/20  Iteration 2937/3560 Training loss: 1.7973 0.6766 sec/batch
Epoch 17/20  Iteration 2938/3560 Training loss: 1.7977 0.6611 sec/batch
Epoch 17/20  Iteration 2939/3560 Training loss: 1.7974 0.6757 sec/batch
Epoch 17/20  Iteration 2940/3560 Training loss: 1.7973 0.6777 sec/batch
Epoch 17/20  Iteration 2941/3560 Training loss: 1.7969 0.6894 sec/batch
Epoch 17/20  Iteration 2942/3560 Training loss: 1.7965 0.6808 sec/batch
Epoch 17/20  Iteration 2943/3560 Training loss: 1.7962 0.6869 sec/batch
Epoch 17/20  Iteration 2944/3560 Training loss: 1.7962 0.6749 sec/batch
Epoch 17/20  Iteration 2945/3560 Training loss: 1.7961 0.6804 sec/batch
Epoch 17/20  Iteration 2946/3560 Training loss: 1.7957 0.6874 sec/batch
Epoch 17/20  Iteration 2947/3560 Training loss: 1.7954 0.6799 sec/batch
Epoch 17/20  Iteration 2948/3560 Training loss: 1.7950 0.6745 sec/batch
Epoch 17/20  Iteration 2949/3560 Training loss: 1.7951 0.6745 sec/batch
Epoch 17/20  Iteration 2950/3560 Training loss: 1.7950 0.6945 sec/batch
Epoch 17/20  Iteration 2951/3560 Training loss: 1.7947 0.6727 sec/batch
Epoch 17/20  Iteration 2952/3560 Training loss: 1.7946 0.6728 sec/batch
Epoch 17/20  Iteration 2953/3560 Training loss: 1.7944 0.6801 sec/batch
Epoch 17/20  Iteration 2954/3560 Training loss: 1.7942 0.8121 sec/batch
Epoch 17/20  Iteration 2955/3560 Training loss: 1.7942 0.6374 sec/batch
Epoch 17/20  Iteration 2956/3560 Training loss: 1.7941 0.6285 sec/batch
Epoch 17/20  Iteration 2957/3560 Training loss: 1.7941 0.6502 sec/batch
Epoch 17/20  Iteration 2958/3560 Training loss: 1.7941 0.6302 sec/batch
Epoch 17/20  Iteration 2959/3560 Training loss: 1.7940 0.6784 sec/batch
Epoch 17/20  Iteration 2960/3560 Training loss: 1.7938 0.7225 sec/batch
Epoch 17/20  Iteration 2961/3560 Training loss: 1.7937 0.6451 sec/batch
Epoch 17/20  Iteration 2962/3560 Training loss: 1.7937 0.6925 sec/batch
Epoch 17/20  Iteration 2963/3560 Training loss: 1.7934 0.6717 sec/batch
Epoch 17/20  Iteration 2964/3560 Training loss: 1.7930 0.6893 sec/batch
Epoch 17/20  Iteration 2965/3560 Training loss: 1.7930 0.6700 sec/batch
Epoch 17/20  Iteration 2966/3560 Training loss: 1.7930 0.6556 sec/batch
Epoch 17/20  Iteration 2967/3560 Training loss: 1.7930 0.6701 sec/batch
Epoch 17/20  Iteration 2968/3560 Training loss: 1.7929 0.6498 sec/batch
Epoch 17/20  Iteration 2969/3560 Training loss: 1.7930 0.6396 sec/batch
Epoch 17/20  Iteration 2970/3560 Training loss: 1.7926 0.6336 sec/batch
Epoch 17/20  Iteration 2971/3560 Training loss: 1.7923 0.6393 sec/batch
Epoch 17/20  Iteration 2972/3560 Training loss: 1.7925 0.6435 sec/batch
Epoch 17/20  Iteration 2973/3560 Training loss: 1.7925 0.6413 sec/batch
Epoch 17/20  Iteration 2974/3560 Training loss: 1.7921 0.6881 sec/batch
Epoch 17/20  Iteration 2975/3560 Training loss: 1.7922 0.7223 sec/batch
Epoch 17/20  Iteration 2976/3560 Training loss: 1.7922 0.6391 sec/batch
Epoch 17/20  Iteration 2977/3560 Training loss: 1.7922 0.7717 sec/batch
Epoch 17/20  Iteration 2978/3560 Training loss: 1.7921 0.8318 sec/batch
Epoch 17/20  Iteration 2979/3560 Training loss: 1.7919 0.6844 sec/batch
Epoch 17/20  Iteration 2980/3560 Training loss: 1.7917 0.7314 sec/batch
Epoch 17/20  Iteration 2981/3560 Training loss: 1.7918 0.7052 sec/batch
Epoch 17/20  Iteration 2982/3560 Training loss: 1.7918 0.6374 sec/batch
Epoch 17/20  Iteration 2983/3560 Training loss: 1.7918 0.6369 sec/batch
Epoch 17/20  Iteration 2984/3560 Training loss: 1.7918 0.6327 sec/batch
Epoch 17/20  Iteration 2985/3560 Training loss: 1.7919 0.6700 sec/batch
Epoch 17/20  Iteration 2986/3560 Training loss: 1.7919 0.6443 sec/batch
Epoch 17/20  Iteration 2987/3560 Training loss: 1.7921 0.6261 sec/batch
Epoch 17/20  Iteration 2988/3560 Training loss: 1.7920 0.6146 sec/batch
Epoch 17/20  Iteration 2989/3560 Training loss: 1.7924 0.6464 sec/batch
Epoch 17/20  Iteration 2990/3560 Training loss: 1.7923 0.6496 sec/batch
Epoch 17/20  Iteration 2991/3560 Training loss: 1.7922 0.8492 sec/batch
Epoch 17/20  Iteration 2992/3560 Training loss: 1.7922 0.8970 sec/batch
Epoch 17/20  Iteration 2993/3560 Training loss: 1.7922 0.8564 sec/batch
Epoch 17/20  Iteration 2994/3560 Training loss: 1.7922 0.6343 sec/batch
Epoch 17/20  Iteration 2995/3560 Training loss: 1.7922 0.6427 sec/batch
Epoch 17/20  Iteration 2996/3560 Training loss: 1.7925 0.6405 sec/batch
Epoch 17/20  Iteration 2997/3560 Training loss: 1.7925 0.6345 sec/batch
Epoch 17/20  Iteration 2998/3560 Training loss: 1.7924 0.6274 sec/batch
Epoch 17/20  Iteration 2999/3560 Training loss: 1.7923 0.6310 sec/batch
Epoch 17/20  Iteration 3000/3560 Training loss: 1.7923 0.6338 sec/batch
Epoch 17/20  Iteration 3001/3560 Training loss: 1.7923 0.6299 sec/batch
Epoch 17/20  Iteration 3002/3560 Training loss: 1.7924 0.6433 sec/batch
Epoch 17/20  Iteration 3003/3560 Training loss: 1.7924 0.6676 sec/batch
Epoch 17/20  Iteration 3004/3560 Training loss: 1.7924 0.8654 sec/batch
Epoch 17/20  Iteration 3005/3560 Training loss: 1.7924 0.6780 sec/batch
Epoch 17/20  Iteration 3006/3560 Training loss: 1.7925 0.7997 sec/batch
Epoch 17/20  Iteration 3007/3560 Training loss: 1.7922 0.7426 sec/batch
Epoch 17/20  Iteration 3008/3560 Training loss: 1.7924 0.6764 sec/batch
Epoch 17/20  Iteration 3009/3560 Training loss: 1.7926 0.7339 sec/batch
Epoch 17/20  Iteration 3010/3560 Training loss: 1.7926 0.6390 sec/batch
Epoch 17/20  Iteration 3011/3560 Training loss: 1.7927 0.6331 sec/batch
Epoch 17/20  Iteration 3012/3560 Training loss: 1.7927 0.6317 sec/batch
Epoch 17/20  Iteration 3013/3560 Training loss: 1.7927 0.6369 sec/batch
Epoch 17/20  Iteration 3014/3560 Training loss: 1.7927 0.6355 sec/batch
Epoch 17/20  Iteration 3015/3560 Training loss: 1.7927 0.6625 sec/batch
Epoch 17/20  Iteration 3016/3560 Training loss: 1.7930 0.6347 sec/batch
Epoch 17/20  Iteration 3017/3560 Training loss: 1.7929 0.6354 sec/batch
Epoch 17/20  Iteration 3018/3560 Training loss: 1.7929 0.6277 sec/batch
Epoch 17/20  Iteration 3019/3560 Training loss: 1.7928 0.6327 sec/batch
Epoch 17/20  Iteration 3020/3560 Training loss: 1.7927 0.6635 sec/batch
Epoch 17/20  Iteration 3021/3560 Training loss: 1.7927 0.6328 sec/batch
Epoch 17/20  Iteration 3022/3560 Training loss: 1.7928 0.6557 sec/batch
Epoch 17/20  Iteration 3023/3560 Training loss: 1.7928 0.6456 sec/batch
Epoch 17/20  Iteration 3024/3560 Training loss: 1.7928 0.6466 sec/batch
Epoch 17/20  Iteration 3025/3560 Training loss: 1.7927 0.6432 sec/batch
Epoch 17/20  Iteration 3026/3560 Training loss: 1.7928 0.6499 sec/batch
Epoch 18/20  Iteration 3027/3560 Training loss: 1.8626 0.6578 sec/batch
Epoch 18/20  Iteration 3028/3560 Training loss: 1.8207 0.6612 sec/batch
Epoch 18/20  Iteration 3029/3560 Training loss: 1.8052 0.7040 sec/batch
Epoch 18/20  Iteration 3030/3560 Training loss: 1.8020 0.6707 sec/batch
Epoch 18/20  Iteration 3031/3560 Training loss: 1.8002 0.6552 sec/batch
Epoch 18/20  Iteration 3032/3560 Training loss: 1.7906 0.7012 sec/batch
Epoch 18/20  Iteration 3033/3560 Training loss: 1.7897 0.6451 sec/batch
Epoch 18/20  Iteration 3034/3560 Training loss: 1.7883 0.6850 sec/batch
Epoch 18/20  Iteration 3035/3560 Training loss: 1.7914 0.6508 sec/batch
Epoch 18/20  Iteration 3036/3560 Training loss: 1.7911 0.6942 sec/batch
Epoch 18/20  Iteration 3037/3560 Training loss: 1.7873 0.6642 sec/batch
Epoch 18/20  Iteration 3038/3560 Training loss: 1.7854 0.6480 sec/batch
Epoch 18/20  Iteration 3039/3560 Training loss: 1.7858 0.6882 sec/batch
Epoch 18/20  Iteration 3040/3560 Training loss: 1.7886 0.6715 sec/batch
Epoch 18/20  Iteration 3041/3560 Training loss: 1.7866 0.6637 sec/batch
Epoch 18/20  Iteration 3042/3560 Training loss: 1.7852 0.7049 sec/batch
Epoch 18/20  Iteration 3043/3560 Training loss: 1.7854 0.6347 sec/batch
Epoch 18/20  Iteration 3044/3560 Training loss: 1.7873 0.6613 sec/batch
Epoch 18/20  Iteration 3045/3560 Training loss: 1.7869 0.6463 sec/batch
Epoch 18/20  Iteration 3046/3560 Training loss: 1.7874 0.6255 sec/batch
Epoch 18/20  Iteration 3047/3560 Training loss: 1.7867 0.6265 sec/batch
Epoch 18/20  Iteration 3048/3560 Training loss: 1.7874 0.6226 sec/batch
Epoch 18/20  Iteration 3049/3560 Training loss: 1.7868 0.6288 sec/batch
Epoch 18/20  Iteration 3050/3560 Training loss: 1.7865 0.6340 sec/batch
Epoch 18/20  Iteration 3051/3560 Training loss: 1.7864 0.6364 sec/batch
Epoch 18/20  Iteration 3052/3560 Training loss: 1.7858 0.6320 sec/batch
Epoch 18/20  Iteration 3053/3560 Training loss: 1.7846 0.6140 sec/batch
Epoch 18/20  Iteration 3054/3560 Training loss: 1.7851 0.6303 sec/batch
Epoch 18/20  Iteration 3055/3560 Training loss: 1.7860 0.6286 sec/batch
Epoch 18/20  Iteration 3056/3560 Training loss: 1.7859 0.6333 sec/batch
Epoch 18/20  Iteration 3057/3560 Training loss: 1.7860 0.6315 sec/batch
Epoch 18/20  Iteration 3058/3560 Training loss: 1.7852 0.6307 sec/batch
Epoch 18/20  Iteration 3059/3560 Training loss: 1.7856 0.6341 sec/batch
Epoch 18/20  Iteration 3060/3560 Training loss: 1.7861 0.6371 sec/batch
Epoch 18/20  Iteration 3061/3560 Training loss: 1.7855 0.6316 sec/batch
Epoch 18/20  Iteration 3062/3560 Training loss: 1.7852 0.6359 sec/batch
Epoch 18/20  Iteration 3063/3560 Training loss: 1.7847 0.6258 sec/batch
Epoch 18/20  Iteration 3064/3560 Training loss: 1.7838 0.6350 sec/batch
Epoch 18/20  Iteration 3065/3560 Training loss: 1.7827 0.6521 sec/batch
Epoch 18/20  Iteration 3066/3560 Training loss: 1.7819 0.6315 sec/batch
Epoch 18/20  Iteration 3067/3560 Training loss: 1.7814 0.6262 sec/batch
Epoch 18/20  Iteration 3068/3560 Training loss: 1.7818 0.6343 sec/batch
Epoch 18/20  Iteration 3069/3560 Training loss: 1.7815 0.6358 sec/batch
Epoch 18/20  Iteration 3070/3560 Training loss: 1.7808 0.6291 sec/batch
Epoch 18/20  Iteration 3071/3560 Training loss: 1.7809 0.6257 sec/batch
Epoch 18/20  Iteration 3072/3560 Training loss: 1.7797 0.6335 sec/batch
Epoch 18/20  Iteration 3073/3560 Training loss: 1.7798 0.6339 sec/batch
Epoch 18/20  Iteration 3074/3560 Training loss: 1.7793 0.6211 sec/batch
Epoch 18/20  Iteration 3075/3560 Training loss: 1.7792 0.6401 sec/batch
Epoch 18/20  Iteration 3076/3560 Training loss: 1.7800 0.6302 sec/batch
Epoch 18/20  Iteration 3077/3560 Training loss: 1.7793 0.6349 sec/batch
Epoch 18/20  Iteration 3078/3560 Training loss: 1.7801 0.6269 sec/batch
Epoch 18/20  Iteration 3079/3560 Training loss: 1.7799 0.6350 sec/batch
Epoch 18/20  Iteration 3080/3560 Training loss: 1.7800 0.6313 sec/batch
Epoch 18/20  Iteration 3081/3560 Training loss: 1.7796 0.6367 sec/batch
Epoch 18/20  Iteration 3082/3560 Training loss: 1.7798 0.6331 sec/batch
Epoch 18/20  Iteration 3083/3560 Training loss: 1.7801 0.6306 sec/batch
Epoch 18/20  Iteration 3084/3560 Training loss: 1.7800 0.6281 sec/batch
Epoch 18/20  Iteration 3085/3560 Training loss: 1.7794 0.6392 sec/batch
Epoch 18/20  Iteration 3086/3560 Training loss: 1.7799 0.6320 sec/batch
Epoch 18/20  Iteration 3087/3560 Training loss: 1.7798 0.6283 sec/batch
Epoch 18/20  Iteration 3088/3560 Training loss: 1.7808 0.6316 sec/batch
Epoch 18/20  Iteration 3089/3560 Training loss: 1.7813 0.6343 sec/batch
Epoch 18/20  Iteration 3090/3560 Training loss: 1.7815 0.6316 sec/batch
Epoch 18/20  Iteration 3091/3560 Training loss: 1.7816 0.6268 sec/batch
Epoch 18/20  Iteration 3092/3560 Training loss: 1.7817 0.6296 sec/batch
Epoch 18/20  Iteration 3093/3560 Training loss: 1.7819 0.6331 sec/batch
Epoch 18/20  Iteration 3094/3560 Training loss: 1.7814 0.6269 sec/batch
Epoch 18/20  Iteration 3095/3560 Training loss: 1.7813 0.6327 sec/batch
Epoch 18/20  Iteration 3096/3560 Training loss: 1.7813 0.6234 sec/batch
Epoch 18/20  Iteration 3097/3560 Training loss: 1.7819 0.7382 sec/batch
Epoch 18/20  Iteration 3098/3560 Training loss: 1.7820 0.6367 sec/batch
Epoch 18/20  Iteration 3099/3560 Training loss: 1.7824 0.6669 sec/batch
Epoch 18/20  Iteration 3100/3560 Training loss: 1.7823 0.6569 sec/batch
Epoch 18/20  Iteration 3101/3560 Training loss: 1.7821 0.6377 sec/batch
Epoch 18/20  Iteration 3102/3560 Training loss: 1.7825 0.6097 sec/batch
Epoch 18/20  Iteration 3103/3560 Training loss: 1.7825 0.6311 sec/batch
Epoch 18/20  Iteration 3104/3560 Training loss: 1.7826 0.6324 sec/batch
Epoch 18/20  Iteration 3105/3560 Training loss: 1.7822 0.6322 sec/batch
Epoch 18/20  Iteration 3106/3560 Training loss: 1.7820 0.6280 sec/batch
Epoch 18/20  Iteration 3107/3560 Training loss: 1.7816 0.6275 sec/batch
Epoch 18/20  Iteration 3108/3560 Training loss: 1.7818 0.6348 sec/batch
Epoch 18/20  Iteration 3109/3560 Training loss: 1.7814 0.6272 sec/batch
Epoch 18/20  Iteration 3110/3560 Training loss: 1.7814 0.6374 sec/batch
Epoch 18/20  Iteration 3111/3560 Training loss: 1.7809 0.6112 sec/batch
Epoch 18/20  Iteration 3112/3560 Training loss: 1.7807 0.6331 sec/batch
Epoch 18/20  Iteration 3113/3560 Training loss: 1.7806 0.6405 sec/batch
Epoch 18/20  Iteration 3114/3560 Training loss: 1.7803 0.6188 sec/batch
Epoch 18/20  Iteration 3115/3560 Training loss: 1.7799 0.6387 sec/batch
Epoch 18/20  Iteration 3116/3560 Training loss: 1.7800 0.6299 sec/batch
Epoch 18/20  Iteration 3117/3560 Training loss: 1.7798 0.6312 sec/batch
Epoch 18/20  Iteration 3118/3560 Training loss: 1.7797 0.6331 sec/batch
Epoch 18/20  Iteration 3119/3560 Training loss: 1.7793 0.6250 sec/batch
Epoch 18/20  Iteration 3120/3560 Training loss: 1.7789 0.6366 sec/batch
Epoch 18/20  Iteration 3121/3560 Training loss: 1.7787 0.6387 sec/batch
Epoch 18/20  Iteration 3122/3560 Training loss: 1.7788 0.6505 sec/batch
Epoch 18/20  Iteration 3123/3560 Training loss: 1.7787 0.6344 sec/batch
Epoch 18/20  Iteration 3124/3560 Training loss: 1.7784 0.6416 sec/batch
Epoch 18/20  Iteration 3125/3560 Training loss: 1.7780 0.6334 sec/batch
Epoch 18/20  Iteration 3126/3560 Training loss: 1.7776 0.6376 sec/batch
Epoch 18/20  Iteration 3127/3560 Training loss: 1.7776 0.6337 sec/batch
Epoch 18/20  Iteration 3128/3560 Training loss: 1.7776 0.6292 sec/batch
Epoch 18/20  Iteration 3129/3560 Training loss: 1.7774 0.6295 sec/batch
Epoch 18/20  Iteration 3130/3560 Training loss: 1.7772 0.6258 sec/batch
Epoch 18/20  Iteration 3131/3560 Training loss: 1.7769 0.6407 sec/batch
Epoch 18/20  Iteration 3132/3560 Training loss: 1.7768 0.6281 sec/batch
Epoch 18/20  Iteration 3133/3560 Training loss: 1.7768 0.6195 sec/batch
Epoch 18/20  Iteration 3134/3560 Training loss: 1.7768 0.6328 sec/batch
Epoch 18/20  Iteration 3135/3560 Training loss: 1.7768 0.6320 sec/batch
Epoch 18/20  Iteration 3136/3560 Training loss: 1.7768 0.6344 sec/batch
Epoch 18/20  Iteration 3137/3560 Training loss: 1.7767 0.6355 sec/batch
Epoch 18/20  Iteration 3138/3560 Training loss: 1.7765 0.6315 sec/batch
Epoch 18/20  Iteration 3139/3560 Training loss: 1.7763 0.6328 sec/batch
Epoch 18/20  Iteration 3140/3560 Training loss: 1.7762 0.6294 sec/batch
Epoch 18/20  Iteration 3141/3560 Training loss: 1.7761 0.6328 sec/batch
Epoch 18/20  Iteration 3142/3560 Training loss: 1.7757 0.6304 sec/batch
Epoch 18/20  Iteration 3143/3560 Training loss: 1.7756 0.6299 sec/batch
Epoch 18/20  Iteration 3144/3560 Training loss: 1.7757 0.6346 sec/batch
Epoch 18/20  Iteration 3145/3560 Training loss: 1.7756 0.6094 sec/batch
Epoch 18/20  Iteration 3146/3560 Training loss: 1.7755 0.6296 sec/batch
Epoch 18/20  Iteration 3147/3560 Training loss: 1.7756 0.6386 sec/batch
Epoch 18/20  Iteration 3148/3560 Training loss: 1.7753 0.6271 sec/batch
Epoch 18/20  Iteration 3149/3560 Training loss: 1.7751 0.6312 sec/batch
Epoch 18/20  Iteration 3150/3560 Training loss: 1.7752 0.6299 sec/batch
Epoch 18/20  Iteration 3151/3560 Training loss: 1.7751 0.6334 sec/batch
Epoch 18/20  Iteration 3152/3560 Training loss: 1.7748 0.6366 sec/batch
Epoch 18/20  Iteration 3153/3560 Training loss: 1.7750 0.6293 sec/batch
Epoch 18/20  Iteration 3154/3560 Training loss: 1.7750 0.6327 sec/batch
Epoch 18/20  Iteration 3155/3560 Training loss: 1.7749 0.6251 sec/batch
Epoch 18/20  Iteration 3156/3560 Training loss: 1.7748 0.6339 sec/batch
Epoch 18/20  Iteration 3157/3560 Training loss: 1.7745 0.6285 sec/batch
Epoch 18/20  Iteration 3158/3560 Training loss: 1.7743 0.6325 sec/batch
Epoch 18/20  Iteration 3159/3560 Training loss: 1.7743 0.6374 sec/batch
Epoch 18/20  Iteration 3160/3560 Training loss: 1.7743 0.6347 sec/batch
Epoch 18/20  Iteration 3161/3560 Training loss: 1.7743 0.6352 sec/batch
Epoch 18/20  Iteration 3162/3560 Training loss: 1.7745 0.6242 sec/batch
Epoch 18/20  Iteration 3163/3560 Training loss: 1.7745 0.6190 sec/batch
Epoch 18/20  Iteration 3164/3560 Training loss: 1.7746 0.6266 sec/batch
Epoch 18/20  Iteration 3165/3560 Training loss: 1.7747 0.6264 sec/batch
Epoch 18/20  Iteration 3166/3560 Training loss: 1.7746 0.6306 sec/batch
Epoch 18/20  Iteration 3167/3560 Training loss: 1.7749 0.6340 sec/batch
Epoch 18/20  Iteration 3168/3560 Training loss: 1.7749 0.6381 sec/batch
Epoch 18/20  Iteration 3169/3560 Training loss: 1.7748 0.6557 sec/batch
Epoch 18/20  Iteration 3170/3560 Training loss: 1.7749 0.7743 sec/batch
Epoch 18/20  Iteration 3171/3560 Training loss: 1.7748 0.6322 sec/batch
Epoch 18/20  Iteration 3172/3560 Training loss: 1.7749 0.6334 sec/batch
Epoch 18/20  Iteration 3173/3560 Training loss: 1.7750 0.6373 sec/batch
Epoch 18/20  Iteration 3174/3560 Training loss: 1.7753 0.6335 sec/batch
Epoch 18/20  Iteration 3175/3560 Training loss: 1.7753 0.6361 sec/batch
Epoch 18/20  Iteration 3176/3560 Training loss: 1.7751 0.6326 sec/batch
Epoch 18/20  Iteration 3177/3560 Training loss: 1.7749 0.6311 sec/batch
Epoch 18/20  Iteration 3178/3560 Training loss: 1.7750 0.6333 sec/batch
Epoch 18/20  Iteration 3179/3560 Training loss: 1.7750 0.6412 sec/batch
Epoch 18/20  Iteration 3180/3560 Training loss: 1.7752 0.6319 sec/batch
Epoch 18/20  Iteration 3181/3560 Training loss: 1.7752 0.6334 sec/batch
Epoch 18/20  Iteration 3182/3560 Training loss: 1.7752 0.6497 sec/batch
Epoch 18/20  Iteration 3183/3560 Training loss: 1.7753 0.6111 sec/batch
Epoch 18/20  Iteration 3184/3560 Training loss: 1.7754 0.6426 sec/batch
Epoch 18/20  Iteration 3185/3560 Training loss: 1.7752 0.6308 sec/batch
Epoch 18/20  Iteration 3186/3560 Training loss: 1.7754 0.6398 sec/batch
Epoch 18/20  Iteration 3187/3560 Training loss: 1.7756 0.6367 sec/batch
Epoch 18/20  Iteration 3188/3560 Training loss: 1.7756 0.6281 sec/batch
Epoch 18/20  Iteration 3189/3560 Training loss: 1.7756 0.6306 sec/batch
Epoch 18/20  Iteration 3190/3560 Training loss: 1.7756 0.6300 sec/batch
Epoch 18/20  Iteration 3191/3560 Training loss: 1.7756 0.6091 sec/batch
Epoch 18/20  Iteration 3192/3560 Training loss: 1.7755 0.6299 sec/batch
Epoch 18/20  Iteration 3193/3560 Training loss: 1.7756 0.6418 sec/batch
Epoch 18/20  Iteration 3194/3560 Training loss: 1.7760 0.6264 sec/batch
Epoch 18/20  Iteration 3195/3560 Training loss: 1.7758 0.6310 sec/batch
Epoch 18/20  Iteration 3196/3560 Training loss: 1.7757 0.6299 sec/batch
Epoch 18/20  Iteration 3197/3560 Training loss: 1.7756 0.6205 sec/batch
Epoch 18/20  Iteration 3198/3560 Training loss: 1.7755 0.6349 sec/batch
Epoch 18/20  Iteration 3199/3560 Training loss: 1.7757 0.6348 sec/batch
Epoch 18/20  Iteration 3200/3560 Training loss: 1.7757 0.6276 sec/batch
Epoch 18/20  Iteration 3201/3560 Training loss: 1.7758 0.6463 sec/batch
Epoch 18/20  Iteration 3202/3560 Training loss: 1.7758 0.6774 sec/batch
Epoch 18/20  Iteration 3203/3560 Training loss: 1.7757 0.6792 sec/batch
Epoch 18/20  Iteration 3204/3560 Training loss: 1.7757 0.6535 sec/batch
Epoch 19/20  Iteration 3205/3560 Training loss: 1.8527 0.6660 sec/batch
Epoch 19/20  Iteration 3206/3560 Training loss: 1.8138 0.6398 sec/batch
Epoch 19/20  Iteration 3207/3560 Training loss: 1.8006 0.6631 sec/batch
Epoch 19/20  Iteration 3208/3560 Training loss: 1.7938 0.6930 sec/batch
Epoch 19/20  Iteration 3209/3560 Training loss: 1.7910 0.6646 sec/batch
Epoch 19/20  Iteration 3210/3560 Training loss: 1.7781 0.6274 sec/batch
Epoch 19/20  Iteration 3211/3560 Training loss: 1.7794 0.6283 sec/batch
Epoch 19/20  Iteration 3212/3560 Training loss: 1.7769 0.6462 sec/batch
Epoch 19/20  Iteration 3213/3560 Training loss: 1.7780 0.6552 sec/batch
Epoch 19/20  Iteration 3214/3560 Training loss: 1.7777 0.6885 sec/batch
Epoch 19/20  Iteration 3215/3560 Training loss: 1.7749 0.6847 sec/batch
Epoch 19/20  Iteration 3216/3560 Training loss: 1.7730 0.6930 sec/batch
Epoch 19/20  Iteration 3217/3560 Training loss: 1.7722 0.6689 sec/batch
Epoch 19/20  Iteration 3218/3560 Training loss: 1.7754 0.6309 sec/batch
Epoch 19/20  Iteration 3219/3560 Training loss: 1.7736 0.7142 sec/batch
Epoch 19/20  Iteration 3220/3560 Training loss: 1.7719 0.6505 sec/batch
Epoch 19/20  Iteration 3221/3560 Training loss: 1.7719 0.6637 sec/batch
Epoch 19/20  Iteration 3222/3560 Training loss: 1.7738 0.6680 sec/batch
Epoch 19/20  Iteration 3223/3560 Training loss: 1.7740 0.8037 sec/batch
Epoch 19/20  Iteration 3224/3560 Training loss: 1.7740 0.7715 sec/batch
Epoch 19/20  Iteration 3225/3560 Training loss: 1.7730 0.7813 sec/batch
Epoch 19/20  Iteration 3226/3560 Training loss: 1.7742 0.6408 sec/batch
Epoch 19/20  Iteration 3227/3560 Training loss: 1.7728 0.6253 sec/batch
Epoch 19/20  Iteration 3228/3560 Training loss: 1.7723 0.6292 sec/batch
Epoch 19/20  Iteration 3229/3560 Training loss: 1.7724 0.6308 sec/batch
Epoch 19/20  Iteration 3230/3560 Training loss: 1.7708 0.6612 sec/batch
Epoch 19/20  Iteration 3231/3560 Training loss: 1.7698 0.6625 sec/batch
Epoch 19/20  Iteration 3232/3560 Training loss: 1.7704 0.6593 sec/batch
Epoch 19/20  Iteration 3233/3560 Training loss: 1.7714 0.6464 sec/batch
Epoch 19/20  Iteration 3234/3560 Training loss: 1.7715 0.6358 sec/batch
Epoch 19/20  Iteration 3235/3560 Training loss: 1.7718 0.6333 sec/batch
Epoch 19/20  Iteration 3236/3560 Training loss: 1.7713 0.6299 sec/batch
Epoch 19/20  Iteration 3237/3560 Training loss: 1.7713 0.6332 sec/batch
Epoch 19/20  Iteration 3238/3560 Training loss: 1.7721 0.6191 sec/batch
Epoch 19/20  Iteration 3239/3560 Training loss: 1.7719 0.6350 sec/batch
Epoch 19/20  Iteration 3240/3560 Training loss: 1.7718 0.6405 sec/batch
Epoch 19/20  Iteration 3241/3560 Training loss: 1.7714 0.6315 sec/batch
Epoch 19/20  Iteration 3242/3560 Training loss: 1.7705 0.6353 sec/batch
Epoch 19/20  Iteration 3243/3560 Training loss: 1.7693 0.6285 sec/batch
Epoch 19/20  Iteration 3244/3560 Training loss: 1.7686 0.6514 sec/batch
Epoch 19/20  Iteration 3245/3560 Training loss: 1.7681 0.6586 sec/batch
Epoch 19/20  Iteration 3246/3560 Training loss: 1.7684 0.6520 sec/batch
Epoch 19/20  Iteration 3247/3560 Training loss: 1.7682 0.6329 sec/batch
Epoch 19/20  Iteration 3248/3560 Training loss: 1.7673 0.6470 sec/batch
Epoch 19/20  Iteration 3249/3560 Training loss: 1.7675 0.6371 sec/batch
Epoch 19/20  Iteration 3250/3560 Training loss: 1.7664 0.6266 sec/batch
Epoch 19/20  Iteration 3251/3560 Training loss: 1.7663 0.6344 sec/batch
Epoch 19/20  Iteration 3252/3560 Training loss: 1.7659 0.6241 sec/batch
Epoch 19/20  Iteration 3253/3560 Training loss: 1.7656 0.6455 sec/batch
Epoch 19/20  Iteration 3254/3560 Training loss: 1.7663 0.6581 sec/batch
Epoch 19/20  Iteration 3255/3560 Training loss: 1.7659 0.6442 sec/batch
Epoch 19/20  Iteration 3256/3560 Training loss: 1.7667 0.6660 sec/batch
Epoch 19/20  Iteration 3257/3560 Training loss: 1.7665 0.6539 sec/batch
Epoch 19/20  Iteration 3258/3560 Training loss: 1.7665 0.6545 sec/batch
Epoch 19/20  Iteration 3259/3560 Training loss: 1.7662 0.6320 sec/batch
Epoch 19/20  Iteration 3260/3560 Training loss: 1.7663 0.6256 sec/batch
Epoch 19/20  Iteration 3261/3560 Training loss: 1.7666 0.6432 sec/batch
Epoch 19/20  Iteration 3262/3560 Training loss: 1.7662 0.6343 sec/batch
Epoch 19/20  Iteration 3263/3560 Training loss: 1.7658 0.6287 sec/batch
Epoch 19/20  Iteration 3264/3560 Training loss: 1.7662 0.6374 sec/batch
Epoch 19/20  Iteration 3265/3560 Training loss: 1.7662 0.6360 sec/batch
Epoch 19/20  Iteration 3266/3560 Training loss: 1.7670 0.6562 sec/batch
Epoch 19/20  Iteration 3267/3560 Training loss: 1.7673 0.6328 sec/batch
Epoch 19/20  Iteration 3268/3560 Training loss: 1.7676 0.6310 sec/batch
Epoch 19/20  Iteration 3269/3560 Training loss: 1.7675 0.6377 sec/batch
Epoch 19/20  Iteration 3270/3560 Training loss: 1.7677 0.6495 sec/batch
Epoch 19/20  Iteration 3271/3560 Training loss: 1.7678 0.6380 sec/batch
Epoch 19/20  Iteration 3272/3560 Training loss: 1.7673 0.6399 sec/batch
Epoch 19/20  Iteration 3273/3560 Training loss: 1.7673 0.6285 sec/batch
Epoch 19/20  Iteration 3274/3560 Training loss: 1.7670 0.6566 sec/batch
Epoch 19/20  Iteration 3275/3560 Training loss: 1.7676 0.6551 sec/batch
Epoch 19/20  Iteration 3276/3560 Training loss: 1.7678 0.6336 sec/batch
Epoch 19/20  Iteration 3277/3560 Training loss: 1.7683 0.6314 sec/batch
Epoch 19/20  Iteration 3278/3560 Training loss: 1.7681 0.6279 sec/batch
Epoch 19/20  Iteration 3279/3560 Training loss: 1.7682 0.6339 sec/batch
Epoch 19/20  Iteration 3280/3560 Training loss: 1.7684 0.6291 sec/batch
Epoch 19/20  Iteration 3281/3560 Training loss: 1.7684 0.6422 sec/batch
Epoch 19/20  Iteration 3282/3560 Training loss: 1.7685 0.6872 sec/batch
Epoch 19/20  Iteration 3283/3560 Training loss: 1.7681 0.7116 sec/batch
Epoch 19/20  Iteration 3284/3560 Training loss: 1.7680 0.6297 sec/batch
Epoch 19/20  Iteration 3285/3560 Training loss: 1.7675 0.6842 sec/batch
Epoch 19/20  Iteration 3286/3560 Training loss: 1.7676 0.6641 sec/batch
Epoch 19/20  Iteration 3287/3560 Training loss: 1.7671 0.6309 sec/batch
Epoch 19/20  Iteration 3288/3560 Training loss: 1.7672 0.6296 sec/batch
Epoch 19/20  Iteration 3289/3560 Training loss: 1.7667 0.6401 sec/batch
Epoch 19/20  Iteration 3290/3560 Training loss: 1.7664 0.6557 sec/batch
Epoch 19/20  Iteration 3291/3560 Training loss: 1.7664 0.6704 sec/batch
Epoch 19/20  Iteration 3292/3560 Training loss: 1.7661 0.6329 sec/batch
Epoch 19/20  Iteration 3293/3560 Training loss: 1.7657 0.6304 sec/batch
Epoch 19/20  Iteration 3294/3560 Training loss: 1.7658 0.6718 sec/batch
Epoch 19/20  Iteration 3295/3560 Training loss: 1.7655 0.6382 sec/batch
Epoch 19/20  Iteration 3296/3560 Training loss: 1.7653 0.6331 sec/batch
Epoch 19/20  Iteration 3297/3560 Training loss: 1.7649 0.6417 sec/batch
Epoch 19/20  Iteration 3298/3560 Training loss: 1.7646 0.6339 sec/batch
Epoch 19/20  Iteration 3299/3560 Training loss: 1.7643 0.6253 sec/batch
Epoch 19/20  Iteration 3300/3560 Training loss: 1.7642 0.6345 sec/batch
Epoch 19/20  Iteration 3301/3560 Training loss: 1.7641 0.6217 sec/batch
Epoch 19/20  Iteration 3302/3560 Training loss: 1.7638 0.6508 sec/batch
Epoch 19/20  Iteration 3303/3560 Training loss: 1.7634 0.6925 sec/batch
Epoch 19/20  Iteration 3304/3560 Training loss: 1.7630 0.6467 sec/batch
Epoch 19/20  Iteration 3305/3560 Training loss: 1.7629 0.6661 sec/batch
Epoch 19/20  Iteration 3306/3560 Training loss: 1.7628 0.6389 sec/batch
Epoch 19/20  Iteration 3307/3560 Training loss: 1.7626 0.6568 sec/batch
Epoch 19/20  Iteration 3308/3560 Training loss: 1.7624 0.6419 sec/batch
Epoch 19/20  Iteration 3309/3560 Training loss: 1.7622 0.6298 sec/batch
Epoch 19/20  Iteration 3310/3560 Training loss: 1.7620 0.6279 sec/batch
Epoch 19/20  Iteration 3311/3560 Training loss: 1.7621 0.6348 sec/batch
Epoch 19/20  Iteration 3312/3560 Training loss: 1.7619 0.6163 sec/batch
Epoch 19/20  Iteration 3313/3560 Training loss: 1.7619 0.6326 sec/batch
Epoch 19/20  Iteration 3314/3560 Training loss: 1.7619 0.6384 sec/batch
Epoch 19/20  Iteration 3315/3560 Training loss: 1.7617 0.6351 sec/batch
Epoch 19/20  Iteration 3316/3560 Training loss: 1.7616 0.6310 sec/batch
Epoch 19/20  Iteration 3317/3560 Training loss: 1.7615 0.6356 sec/batch
Epoch 19/20  Iteration 3318/3560 Training loss: 1.7614 0.6271 sec/batch
Epoch 19/20  Iteration 3319/3560 Training loss: 1.7612 0.6336 sec/batch
Epoch 19/20  Iteration 3320/3560 Training loss: 1.7609 0.6286 sec/batch
Epoch 19/20  Iteration 3321/3560 Training loss: 1.7610 0.6670 sec/batch
Epoch 19/20  Iteration 3322/3560 Training loss: 1.7610 0.6638 sec/batch
Epoch 19/20  Iteration 3323/3560 Training loss: 1.7610 0.6606 sec/batch
Epoch 19/20  Iteration 3324/3560 Training loss: 1.7610 0.6335 sec/batch
Epoch 19/20  Iteration 3325/3560 Training loss: 1.7610 0.6332 sec/batch
Epoch 19/20  Iteration 3326/3560 Training loss: 1.7608 0.6308 sec/batch
Epoch 19/20  Iteration 3327/3560 Training loss: 1.7605 0.6264 sec/batch
Epoch 19/20  Iteration 3328/3560 Training loss: 1.7607 0.6292 sec/batch
Epoch 19/20  Iteration 3329/3560 Training loss: 1.7607 0.6343 sec/batch
Epoch 19/20  Iteration 3330/3560 Training loss: 1.7605 0.6239 sec/batch
Epoch 19/20  Iteration 3331/3560 Training loss: 1.7607 0.6277 sec/batch
Epoch 19/20  Iteration 3332/3560 Training loss: 1.7606 0.6396 sec/batch
Epoch 19/20  Iteration 3333/3560 Training loss: 1.7606 0.6294 sec/batch
Epoch 19/20  Iteration 3334/3560 Training loss: 1.7605 0.6280 sec/batch
Epoch 19/20  Iteration 3335/3560 Training loss: 1.7602 0.6273 sec/batch
Epoch 19/20  Iteration 3336/3560 Training loss: 1.7599 0.6476 sec/batch
Epoch 19/20  Iteration 3337/3560 Training loss: 1.7600 0.6428 sec/batch
Epoch 19/20  Iteration 3338/3560 Training loss: 1.7600 0.6307 sec/batch
Epoch 19/20  Iteration 3339/3560 Training loss: 1.7601 0.6443 sec/batch
Epoch 19/20  Iteration 3340/3560 Training loss: 1.7601 0.6498 sec/batch
Epoch 19/20  Iteration 3341/3560 Training loss: 1.7602 0.6641 sec/batch
Epoch 19/20  Iteration 3342/3560 Training loss: 1.7603 0.6511 sec/batch
Epoch 19/20  Iteration 3343/3560 Training loss: 1.7603 0.6642 sec/batch
Epoch 19/20  Iteration 3344/3560 Training loss: 1.7602 0.6488 sec/batch
Epoch 19/20  Iteration 3345/3560 Training loss: 1.7605 0.6282 sec/batch
Epoch 19/20  Iteration 3346/3560 Training loss: 1.7604 0.6250 sec/batch
Epoch 19/20  Iteration 3347/3560 Training loss: 1.7604 0.6289 sec/batch
Epoch 19/20  Iteration 3348/3560 Training loss: 1.7605 0.6335 sec/batch
Epoch 19/20  Iteration 3349/3560 Training loss: 1.7603 0.6269 sec/batch
Epoch 19/20  Iteration 3350/3560 Training loss: 1.7605 0.6282 sec/batch
Epoch 19/20  Iteration 3351/3560 Training loss: 1.7605 0.6307 sec/batch
Epoch 19/20  Iteration 3352/3560 Training loss: 1.7607 0.6444 sec/batch
Epoch 19/20  Iteration 3353/3560 Training loss: 1.7607 0.6655 sec/batch
Epoch 19/20  Iteration 3354/3560 Training loss: 1.7607 0.6722 sec/batch
Epoch 19/20  Iteration 3355/3560 Training loss: 1.7605 0.6381 sec/batch
Epoch 19/20  Iteration 3356/3560 Training loss: 1.7606 0.6362 sec/batch
Epoch 19/20  Iteration 3357/3560 Training loss: 1.7606 0.6501 sec/batch
Epoch 19/20  Iteration 3358/3560 Training loss: 1.7607 0.6347 sec/batch
Epoch 19/20  Iteration 3359/3560 Training loss: 1.7607 0.6277 sec/batch
Epoch 19/20  Iteration 3360/3560 Training loss: 1.7606 0.6250 sec/batch
Epoch 19/20  Iteration 3361/3560 Training loss: 1.7607 0.6409 sec/batch
Epoch 19/20  Iteration 3362/3560 Training loss: 1.7608 0.6354 sec/batch
Epoch 19/20  Iteration 3363/3560 Training loss: 1.7605 0.6337 sec/batch
Epoch 19/20  Iteration 3364/3560 Training loss: 1.7608 0.6315 sec/batch
Epoch 19/20  Iteration 3365/3560 Training loss: 1.7609 0.6516 sec/batch
Epoch 19/20  Iteration 3366/3560 Training loss: 1.7609 0.6660 sec/batch
Epoch 19/20  Iteration 3367/3560 Training loss: 1.7610 0.6586 sec/batch
Epoch 19/20  Iteration 3368/3560 Training loss: 1.7610 0.6282 sec/batch
Epoch 19/20  Iteration 3369/3560 Training loss: 1.7610 0.6396 sec/batch
Epoch 19/20  Iteration 3370/3560 Training loss: 1.7610 0.6381 sec/batch
Epoch 19/20  Iteration 3371/3560 Training loss: 1.7610 0.6302 sec/batch
Epoch 19/20  Iteration 3372/3560 Training loss: 1.7613 0.6405 sec/batch
Epoch 19/20  Iteration 3373/3560 Training loss: 1.7613 0.6322 sec/batch
Epoch 19/20  Iteration 3374/3560 Training loss: 1.7613 0.6397 sec/batch
Epoch 19/20  Iteration 3375/3560 Training loss: 1.7612 0.6330 sec/batch
Epoch 19/20  Iteration 3376/3560 Training loss: 1.7610 0.6312 sec/batch
Epoch 19/20  Iteration 3377/3560 Training loss: 1.7611 0.6359 sec/batch
Epoch 19/20  Iteration 3378/3560 Training loss: 1.7612 0.6366 sec/batch
Epoch 19/20  Iteration 3379/3560 Training loss: 1.7612 0.6443 sec/batch
Epoch 19/20  Iteration 3380/3560 Training loss: 1.7612 0.6262 sec/batch
Epoch 19/20  Iteration 3381/3560 Training loss: 1.7611 0.6371 sec/batch
Epoch 19/20  Iteration 3382/3560 Training loss: 1.7611 0.6291 sec/batch
Epoch 20/20  Iteration 3383/3560 Training loss: 1.8198 0.6348 sec/batch
Epoch 20/20  Iteration 3384/3560 Training loss: 1.7875 0.6811 sec/batch
Epoch 20/20  Iteration 3385/3560 Training loss: 1.7744 0.6591 sec/batch
Epoch 20/20  Iteration 3386/3560 Training loss: 1.7691 0.6327 sec/batch
Epoch 20/20  Iteration 3387/3560 Training loss: 1.7638 0.6369 sec/batch
Epoch 20/20  Iteration 3388/3560 Training loss: 1.7542 0.6331 sec/batch
Epoch 20/20  Iteration 3389/3560 Training loss: 1.7564 0.6501 sec/batch
Epoch 20/20  Iteration 3390/3560 Training loss: 1.7556 0.6271 sec/batch
Epoch 20/20  Iteration 3391/3560 Training loss: 1.7582 0.6337 sec/batch
Epoch 20/20  Iteration 3392/3560 Training loss: 1.7579 0.6356 sec/batch
Epoch 20/20  Iteration 3393/3560 Training loss: 1.7550 0.6326 sec/batch
Epoch 20/20  Iteration 3394/3560 Training loss: 1.7542 0.6269 sec/batch
Epoch 20/20  Iteration 3395/3560 Training loss: 1.7546 0.6436 sec/batch
Epoch 20/20  Iteration 3396/3560 Training loss: 1.7571 0.6369 sec/batch
Epoch 20/20  Iteration 3397/3560 Training loss: 1.7570 0.6233 sec/batch
Epoch 20/20  Iteration 3398/3560 Training loss: 1.7568 0.6480 sec/batch
Epoch 20/20  Iteration 3399/3560 Training loss: 1.7570 0.6627 sec/batch
Epoch 20/20  Iteration 3400/3560 Training loss: 1.7598 0.6666 sec/batch
Epoch 20/20  Iteration 3401/3560 Training loss: 1.7593 0.6358 sec/batch
Epoch 20/20  Iteration 3402/3560 Training loss: 1.7598 0.6305 sec/batch
Epoch 20/20  Iteration 3403/3560 Training loss: 1.7590 0.6230 sec/batch
Epoch 20/20  Iteration 3404/3560 Training loss: 1.7598 0.6273 sec/batch
Epoch 20/20  Iteration 3405/3560 Training loss: 1.7593 0.6400 sec/batch
Epoch 20/20  Iteration 3406/3560 Training loss: 1.7594 0.6365 sec/batch
Epoch 20/20  Iteration 3407/3560 Training loss: 1.7589 0.6351 sec/batch
Epoch 20/20  Iteration 3408/3560 Training loss: 1.7577 0.6276 sec/batch
Epoch 20/20  Iteration 3409/3560 Training loss: 1.7565 0.6369 sec/batch
Epoch 20/20  Iteration 3410/3560 Training loss: 1.7570 0.6349 sec/batch
Epoch 20/20  Iteration 3411/3560 Training loss: 1.7579 0.6292 sec/batch
Epoch 20/20  Iteration 3412/3560 Training loss: 1.7583 0.6295 sec/batch
Epoch 20/20  Iteration 3413/3560 Training loss: 1.7583 0.6384 sec/batch
Epoch 20/20  Iteration 3414/3560 Training loss: 1.7575 0.6338 sec/batch
Epoch 20/20  Iteration 3415/3560 Training loss: 1.7580 0.6244 sec/batch
Epoch 20/20  Iteration 3416/3560 Training loss: 1.7586 0.6339 sec/batch
Epoch 20/20  Iteration 3417/3560 Training loss: 1.7584 0.6148 sec/batch
Epoch 20/20  Iteration 3418/3560 Training loss: 1.7584 0.6216 sec/batch
Epoch 20/20  Iteration 3419/3560 Training loss: 1.7576 0.6324 sec/batch
Epoch 20/20  Iteration 3420/3560 Training loss: 1.7566 0.6313 sec/batch
Epoch 20/20  Iteration 3421/3560 Training loss: 1.7553 0.6318 sec/batch
Epoch 20/20  Iteration 3422/3560 Training loss: 1.7546 0.6290 sec/batch
Epoch 20/20  Iteration 3423/3560 Training loss: 1.7540 0.6238 sec/batch
Epoch 20/20  Iteration 3424/3560 Training loss: 1.7544 0.6281 sec/batch
Epoch 20/20  Iteration 3425/3560 Training loss: 1.7541 0.6312 sec/batch
Epoch 20/20  Iteration 3426/3560 Training loss: 1.7535 0.6300 sec/batch
Epoch 20/20  Iteration 3427/3560 Training loss: 1.7539 0.6308 sec/batch
Epoch 20/20  Iteration 3428/3560 Training loss: 1.7529 0.6330 sec/batch
Epoch 20/20  Iteration 3429/3560 Training loss: 1.7529 0.6413 sec/batch
Epoch 20/20  Iteration 3430/3560 Training loss: 1.7523 0.6264 sec/batch
Epoch 20/20  Iteration 3431/3560 Training loss: 1.7520 0.6087 sec/batch
Epoch 20/20  Iteration 3432/3560 Training loss: 1.7528 0.6219 sec/batch
Epoch 20/20  Iteration 3433/3560 Training loss: 1.7523 0.6339 sec/batch
Epoch 20/20  Iteration 3434/3560 Training loss: 1.7531 0.6116 sec/batch
Epoch 20/20  Iteration 3435/3560 Training loss: 1.7529 0.6340 sec/batch
Epoch 20/20  Iteration 3436/3560 Training loss: 1.7530 0.6791 sec/batch
Epoch 20/20  Iteration 3437/3560 Training loss: 1.7525 0.6428 sec/batch
Epoch 20/20  Iteration 3438/3560 Training loss: 1.7527 0.6765 sec/batch
Epoch 20/20  Iteration 3439/3560 Training loss: 1.7529 0.6414 sec/batch
Epoch 20/20  Iteration 3440/3560 Training loss: 1.7526 0.6499 sec/batch
Epoch 20/20  Iteration 3441/3560 Training loss: 1.7522 0.6565 sec/batch
Epoch 20/20  Iteration 3442/3560 Training loss: 1.7528 0.6180 sec/batch
Epoch 20/20  Iteration 3443/3560 Training loss: 1.7528 0.6391 sec/batch
Epoch 20/20  Iteration 3444/3560 Training loss: 1.7536 0.6349 sec/batch
Epoch 20/20  Iteration 3445/3560 Training loss: 1.7541 0.6344 sec/batch
Epoch 20/20  Iteration 3446/3560 Training loss: 1.7542 0.6301 sec/batch
Epoch 20/20  Iteration 3447/3560 Training loss: 1.7543 0.6545 sec/batch
Epoch 20/20  Iteration 3448/3560 Training loss: 1.7546 0.6459 sec/batch
Epoch 20/20  Iteration 3449/3560 Training loss: 1.7545 0.6429 sec/batch
Epoch 20/20  Iteration 3450/3560 Training loss: 1.7541 0.6420 sec/batch
Epoch 20/20  Iteration 3451/3560 Training loss: 1.7541 0.6436 sec/batch
Epoch 20/20  Iteration 3452/3560 Training loss: 1.7541 0.6408 sec/batch
Epoch 20/20  Iteration 3453/3560 Training loss: 1.7545 0.6454 sec/batch
Epoch 20/20  Iteration 3454/3560 Training loss: 1.7547 0.6376 sec/batch
Epoch 20/20  Iteration 3455/3560 Training loss: 1.7551 0.6314 sec/batch
Epoch 20/20  Iteration 3456/3560 Training loss: 1.7548 0.6395 sec/batch
Epoch 20/20  Iteration 3457/3560 Training loss: 1.7549 0.6421 sec/batch
Epoch 20/20  Iteration 3458/3560 Training loss: 1.7552 0.6343 sec/batch
Epoch 20/20  Iteration 3459/3560 Training loss: 1.7552 0.6392 sec/batch
Epoch 20/20  Iteration 3460/3560 Training loss: 1.7552 0.6450 sec/batch
Epoch 20/20  Iteration 3461/3560 Training loss: 1.7549 0.6404 sec/batch
Epoch 20/20  Iteration 3462/3560 Training loss: 1.7547 0.6391 sec/batch
Epoch 20/20  Iteration 3463/3560 Training loss: 1.7542 0.6300 sec/batch
Epoch 20/20  Iteration 3464/3560 Training loss: 1.7543 0.6432 sec/batch
Epoch 20/20  Iteration 3465/3560 Training loss: 1.7538 0.6311 sec/batch
Epoch 20/20  Iteration 3466/3560 Training loss: 1.7537 0.6490 sec/batch
Epoch 20/20  Iteration 3467/3560 Training loss: 1.7532 0.6428 sec/batch
Epoch 20/20  Iteration 3468/3560 Training loss: 1.7530 0.7127 sec/batch
Epoch 20/20  Iteration 3469/3560 Training loss: 1.7529 0.6667 sec/batch
Epoch 20/20  Iteration 3470/3560 Training loss: 1.7525 0.6675 sec/batch
Epoch 20/20  Iteration 3471/3560 Training loss: 1.7521 0.6596 sec/batch
Epoch 20/20  Iteration 3472/3560 Training loss: 1.7522 0.6612 sec/batch
Epoch 20/20  Iteration 3473/3560 Training loss: 1.7520 0.6367 sec/batch
Epoch 20/20  Iteration 3474/3560 Training loss: 1.7518 0.6394 sec/batch
Epoch 20/20  Iteration 3475/3560 Training loss: 1.7515 0.6374 sec/batch
Epoch 20/20  Iteration 3476/3560 Training loss: 1.7512 0.6386 sec/batch
Epoch 20/20  Iteration 3477/3560 Training loss: 1.7508 0.6378 sec/batch
Epoch 20/20  Iteration 3478/3560 Training loss: 1.7508 0.6350 sec/batch
Epoch 20/20  Iteration 3479/3560 Training loss: 1.7506 0.6411 sec/batch
Epoch 20/20  Iteration 3480/3560 Training loss: 1.7502 0.6401 sec/batch
Epoch 20/20  Iteration 3481/3560 Training loss: 1.7499 0.6380 sec/batch
Epoch 20/20  Iteration 3482/3560 Training loss: 1.7495 0.6381 sec/batch
Epoch 20/20  Iteration 3483/3560 Training loss: 1.7496 0.6459 sec/batch
Epoch 20/20  Iteration 3484/3560 Training loss: 1.7496 0.6225 sec/batch
Epoch 20/20  Iteration 3485/3560 Training loss: 1.7494 0.6346 sec/batch
Epoch 20/20  Iteration 3486/3560 Training loss: 1.7491 0.6375 sec/batch
Epoch 20/20  Iteration 3487/3560 Training loss: 1.7490 0.6389 sec/batch
Epoch 20/20  Iteration 3488/3560 Training loss: 1.7489 0.6423 sec/batch
Epoch 20/20  Iteration 3489/3560 Training loss: 1.7489 0.6554 sec/batch
Epoch 20/20  Iteration 3490/3560 Training loss: 1.7488 0.6498 sec/batch
Epoch 20/20  Iteration 3491/3560 Training loss: 1.7490 0.6348 sec/batch
Epoch 20/20  Iteration 3492/3560 Training loss: 1.7490 0.6370 sec/batch
Epoch 20/20  Iteration 3493/3560 Training loss: 1.7489 0.6349 sec/batch
Epoch 20/20  Iteration 3494/3560 Training loss: 1.7489 0.6328 sec/batch
Epoch 20/20  Iteration 3495/3560 Training loss: 1.7487 0.6377 sec/batch
Epoch 20/20  Iteration 3496/3560 Training loss: 1.7485 0.6395 sec/batch
Epoch 20/20  Iteration 3497/3560 Training loss: 1.7483 0.6348 sec/batch
Epoch 20/20  Iteration 3498/3560 Training loss: 1.7480 0.6336 sec/batch
Epoch 20/20  Iteration 3499/3560 Training loss: 1.7480 0.6429 sec/batch
Epoch 20/20  Iteration 3500/3560 Training loss: 1.7479 0.6225 sec/batch
Epoch 20/20  Iteration 3501/3560 Training loss: 1.7478 0.6261 sec/batch
Epoch 20/20  Iteration 3502/3560 Training loss: 1.7478 0.6373 sec/batch
Epoch 20/20  Iteration 3503/3560 Training loss: 1.7477 0.6414 sec/batch
Epoch 20/20  Iteration 3504/3560 Training loss: 1.7475 0.6416 sec/batch
Epoch 20/20  Iteration 3505/3560 Training loss: 1.7473 0.6412 sec/batch
Epoch 20/20  Iteration 3506/3560 Training loss: 1.7475 0.6346 sec/batch
Epoch 20/20  Iteration 3507/3560 Training loss: 1.7475 0.6426 sec/batch
Epoch 20/20  Iteration 3508/3560 Training loss: 1.7471 0.6420 sec/batch
Epoch 20/20  Iteration 3509/3560 Training loss: 1.7473 0.6742 sec/batch
Epoch 20/20  Iteration 3510/3560 Training loss: 1.7473 0.6428 sec/batch
Epoch 20/20  Iteration 3511/3560 Training loss: 1.7472 0.6424 sec/batch
Epoch 20/20  Iteration 3512/3560 Training loss: 1.7470 0.6471 sec/batch
Epoch 20/20  Iteration 3513/3560 Training loss: 1.7467 0.6427 sec/batch
Epoch 20/20  Iteration 3514/3560 Training loss: 1.7465 0.6580 sec/batch
Epoch 20/20  Iteration 3515/3560 Training loss: 1.7465 0.6459 sec/batch
Epoch 20/20  Iteration 3516/3560 Training loss: 1.7466 0.6433 sec/batch
Epoch 20/20  Iteration 3517/3560 Training loss: 1.7466 0.6443 sec/batch
Epoch 20/20  Iteration 3518/3560 Training loss: 1.7467 0.6380 sec/batch
Epoch 20/20  Iteration 3519/3560 Training loss: 1.7468 0.6350 sec/batch
Epoch 20/20  Iteration 3520/3560 Training loss: 1.7469 0.6672 sec/batch
Epoch 20/20  Iteration 3521/3560 Training loss: 1.7469 0.6483 sec/batch
Epoch 20/20  Iteration 3522/3560 Training loss: 1.7469 0.6240 sec/batch
Epoch 20/20  Iteration 3523/3560 Training loss: 1.7472 0.6708 sec/batch
Epoch 20/20  Iteration 3524/3560 Training loss: 1.7471 0.6320 sec/batch
Epoch 20/20  Iteration 3525/3560 Training loss: 1.7471 0.6379 sec/batch
Epoch 20/20  Iteration 3526/3560 Training loss: 1.7472 0.6343 sec/batch
Epoch 20/20  Iteration 3527/3560 Training loss: 1.7471 0.6417 sec/batch
Epoch 20/20  Iteration 3528/3560 Training loss: 1.7472 0.6356 sec/batch
Epoch 20/20  Iteration 3529/3560 Training loss: 1.7472 0.6510 sec/batch
Epoch 20/20  Iteration 3530/3560 Training loss: 1.7473 0.6471 sec/batch
Epoch 20/20  Iteration 3531/3560 Training loss: 1.7474 0.6365 sec/batch
Epoch 20/20  Iteration 3532/3560 Training loss: 1.7474 0.6456 sec/batch
Epoch 20/20  Iteration 3533/3560 Training loss: 1.7471 0.6389 sec/batch
Epoch 20/20  Iteration 3534/3560 Training loss: 1.7472 0.6563 sec/batch
Epoch 20/20  Iteration 3535/3560 Training loss: 1.7472 0.6411 sec/batch
Epoch 20/20  Iteration 3536/3560 Training loss: 1.7473 0.6345 sec/batch
Epoch 20/20  Iteration 3537/3560 Training loss: 1.7473 0.6352 sec/batch
Epoch 20/20  Iteration 3538/3560 Training loss: 1.7473 0.6352 sec/batch
Epoch 20/20  Iteration 3539/3560 Training loss: 1.7474 0.6456 sec/batch
Epoch 20/20  Iteration 3540/3560 Training loss: 1.7474 0.6921 sec/batch
Epoch 20/20  Iteration 3541/3560 Training loss: 1.7472 0.6582 sec/batch
Epoch 20/20  Iteration 3542/3560 Training loss: 1.7474 0.6554 sec/batch
Epoch 20/20  Iteration 3543/3560 Training loss: 1.7476 0.6438 sec/batch
Epoch 20/20  Iteration 3544/3560 Training loss: 1.7476 0.6247 sec/batch
Epoch 20/20  Iteration 3545/3560 Training loss: 1.7476 0.6317 sec/batch
Epoch 20/20  Iteration 3546/3560 Training loss: 1.7476 0.6687 sec/batch
Epoch 20/20  Iteration 3547/3560 Training loss: 1.7476 0.6728 sec/batch
Epoch 20/20  Iteration 3548/3560 Training loss: 1.7475 0.6503 sec/batch
Epoch 20/20  Iteration 3549/3560 Training loss: 1.7476 0.6402 sec/batch
Epoch 20/20  Iteration 3550/3560 Training loss: 1.7480 0.6508 sec/batch
Epoch 20/20  Iteration 3551/3560 Training loss: 1.7480 0.6437 sec/batch
Epoch 20/20  Iteration 3552/3560 Training loss: 1.7479 0.6783 sec/batch
Epoch 20/20  Iteration 3553/3560 Training loss: 1.7478 0.6636 sec/batch
Epoch 20/20  Iteration 3554/3560 Training loss: 1.7477 0.6367 sec/batch
Epoch 20/20  Iteration 3555/3560 Training loss: 1.7478 0.6428 sec/batch
Epoch 20/20  Iteration 3556/3560 Training loss: 1.7479 0.6384 sec/batch
Epoch 20/20  Iteration 3557/3560 Training loss: 1.7479 0.6728 sec/batch
Epoch 20/20  Iteration 3558/3560 Training loss: 1.7479 0.6599 sec/batch
Epoch 20/20  Iteration 3559/3560 Training loss: 1.7478 0.6457 sec/batch
Epoch 20/20  Iteration 3560/3560 Training loss: 1.7479 0.6338 sec/batch
Epoch 1/20  Iteration 1/3560 Training loss: 4.4185 0.9624 sec/batch
Epoch 1/20  Iteration 2/3560 Training loss: 4.3935 0.8549 sec/batch
Epoch 1/20  Iteration 3/3560 Training loss: 4.3331 0.8866 sec/batch
Epoch 1/20  Iteration 4/3560 Training loss: 4.1888 0.8936 sec/batch
Epoch 1/20  Iteration 5/3560 Training loss: 4.0567 0.9176 sec/batch
Epoch 1/20  Iteration 6/3560 Training loss: 3.9597 0.8908 sec/batch
Epoch 1/20  Iteration 7/3560 Training loss: 3.8817 0.9123 sec/batch
Epoch 1/20  Iteration 8/3560 Training loss: 3.8167 0.8727 sec/batch
Epoch 1/20  Iteration 9/3560 Training loss: 3.7621 0.8845 sec/batch
Epoch 1/20  Iteration 10/3560 Training loss: 3.7155 0.9804 sec/batch
Epoch 1/20  Iteration 11/3560 Training loss: 3.6727 0.8927 sec/batch
Epoch 1/20  Iteration 12/3560 Training loss: 3.6381 0.8493 sec/batch
Epoch 1/20  Iteration 13/3560 Training loss: 3.6076 0.8773 sec/batch
Epoch 1/20  Iteration 14/3560 Training loss: 3.5816 0.8602 sec/batch
Epoch 1/20  Iteration 15/3560 Training loss: 3.5582 0.8590 sec/batch
Epoch 1/20  Iteration 16/3560 Training loss: 3.5369 0.8411 sec/batch
Epoch 1/20  Iteration 17/3560 Training loss: 3.5174 0.8485 sec/batch
Epoch 1/20  Iteration 18/3560 Training loss: 3.5010 0.8752 sec/batch
Epoch 1/20  Iteration 19/3560 Training loss: 3.4852 0.8486 sec/batch
Epoch 1/20  Iteration 20/3560 Training loss: 3.4693 0.8654 sec/batch
Epoch 1/20  Iteration 21/3560 Training loss: 3.4553 0.8471 sec/batch
Epoch 1/20  Iteration 22/3560 Training loss: 3.4424 0.8607 sec/batch
Epoch 1/20  Iteration 23/3560 Training loss: 3.4301 0.8382 sec/batch
Epoch 1/20  Iteration 24/3560 Training loss: 3.4189 0.8529 sec/batch
Epoch 1/20  Iteration 25/3560 Training loss: 3.4081 0.8384 sec/batch
Epoch 1/20  Iteration 26/3560 Training loss: 3.3986 0.8575 sec/batch
Epoch 1/20  Iteration 27/3560 Training loss: 3.3899 0.8989 sec/batch
Epoch 1/20  Iteration 28/3560 Training loss: 3.3808 0.8472 sec/batch
Epoch 1/20  Iteration 29/3560 Training loss: 3.3724 0.8487 sec/batch
Epoch 1/20  Iteration 30/3560 Training loss: 3.3643 0.8568 sec/batch
Epoch 1/20  Iteration 31/3560 Training loss: 3.3578 0.8471 sec/batch
Epoch 1/20  Iteration 32/3560 Training loss: 3.3505 0.8474 sec/batch
Epoch 1/20  Iteration 33/3560 Training loss: 3.3431 0.8480 sec/batch
Epoch 1/20  Iteration 34/3560 Training loss: 3.3368 0.8565 sec/batch
Epoch 1/20  Iteration 35/3560 Training loss: 3.3303 0.8514 sec/batch
Epoch 1/20  Iteration 36/3560 Training loss: 3.3248 0.8508 sec/batch
Epoch 1/20  Iteration 37/3560 Training loss: 3.3186 0.8586 sec/batch
Epoch 1/20  Iteration 38/3560 Training loss: 3.3127 0.8430 sec/batch
Epoch 1/20  Iteration 39/3560 Training loss: 3.3070 0.8510 sec/batch
Epoch 1/20  Iteration 40/3560 Training loss: 3.3017 0.8992 sec/batch
Epoch 1/20  Iteration 41/3560 Training loss: 3.2963 0.8493 sec/batch
Epoch 1/20  Iteration 42/3560 Training loss: 3.2912 0.8475 sec/batch
Epoch 1/20  Iteration 43/3560 Training loss: 3.2862 0.8549 sec/batch
Epoch 1/20  Iteration 44/3560 Training loss: 3.2813 0.8515 sec/batch
Epoch 1/20  Iteration 45/3560 Training loss: 3.2766 0.8458 sec/batch
Epoch 1/20  Iteration 46/3560 Training loss: 3.2722 0.8447 sec/batch
Epoch 1/20  Iteration 47/3560 Training loss: 3.2680 0.8538 sec/batch
Epoch 1/20  Iteration 48/3560 Training loss: 3.2641 0.8451 sec/batch
Epoch 1/20  Iteration 49/3560 Training loss: 3.2602 0.8672 sec/batch
Epoch 1/20  Iteration 50/3560 Training loss: 3.2564 0.8430 sec/batch
Epoch 1/20  Iteration 51/3560 Training loss: 3.2525 0.8440 sec/batch
Epoch 1/20  Iteration 52/3560 Training loss: 3.2486 0.9059 sec/batch
Epoch 1/20  Iteration 53/3560 Training loss: 3.2448 1.0640 sec/batch
Epoch 1/20  Iteration 54/3560 Training loss: 3.2409 0.8544 sec/batch
Epoch 1/20  Iteration 55/3560 Training loss: 3.2373 0.8389 sec/batch
Epoch 1/20  Iteration 56/3560 Training loss: 3.2335 0.8460 sec/batch
Epoch 1/20  Iteration 57/3560 Training loss: 3.2298 0.8454 sec/batch
Epoch 1/20  Iteration 58/3560 Training loss: 3.2263 0.8454 sec/batch
Epoch 1/20  Iteration 59/3560 Training loss: 3.2226 0.9184 sec/batch
Epoch 1/20  Iteration 60/3560 Training loss: 3.2191 0.8414 sec/batch
Epoch 1/20  Iteration 61/3560 Training loss: 3.2157 0.8521 sec/batch
Epoch 1/20  Iteration 62/3560 Training loss: 3.2126 0.8449 sec/batch
Epoch 1/20  Iteration 63/3560 Training loss: 3.2096 0.8481 sec/batch
Epoch 1/20  Iteration 64/3560 Training loss: 3.2058 0.8497 sec/batch
Epoch 1/20  Iteration 65/3560 Training loss: 3.2021 0.8381 sec/batch
Epoch 1/20  Iteration 66/3560 Training loss: 3.1989 0.9015 sec/batch
Epoch 1/20  Iteration 67/3560 Training loss: 3.1955 0.8475 sec/batch
Epoch 1/20  Iteration 68/3560 Training loss: 3.1914 0.9219 sec/batch
Epoch 1/20  Iteration 69/3560 Training loss: 3.1876 0.8650 sec/batch
Epoch 1/20  Iteration 70/3560 Training loss: 3.1842 0.9201 sec/batch
Epoch 1/20  Iteration 71/3560 Training loss: 3.1806 0.8632 sec/batch
Epoch 1/20  Iteration 72/3560 Training loss: 3.1774 0.8464 sec/batch
Epoch 1/20  Iteration 73/3560 Training loss: 3.1737 0.8492 sec/batch
Epoch 1/20  Iteration 74/3560 Training loss: 3.1702 0.8481 sec/batch
Epoch 1/20  Iteration 75/3560 Training loss: 3.1667 0.8528 sec/batch
Epoch 1/20  Iteration 76/3560 Training loss: 3.1632 0.8424 sec/batch
Epoch 1/20  Iteration 77/3560 Training loss: 3.1595 0.8934 sec/batch
Epoch 1/20  Iteration 78/3560 Training loss: 3.1558 0.8437 sec/batch
Epoch 1/20  Iteration 79/3560 Training loss: 3.1518 0.8575 sec/batch
Epoch 1/20  Iteration 80/3560 Training loss: 3.1479 0.8649 sec/batch
Epoch 1/20  Iteration 81/3560 Training loss: 3.1441 0.8464 sec/batch
Epoch 1/20  Iteration 82/3560 Training loss: 3.1404 0.8503 sec/batch
Epoch 1/20  Iteration 83/3560 Training loss: 3.1365 0.8462 sec/batch
Epoch 1/20  Iteration 84/3560 Training loss: 3.1325 0.8444 sec/batch
Epoch 1/20  Iteration 85/3560 Training loss: 3.1283 0.8416 sec/batch
Epoch 1/20  Iteration 86/3560 Training loss: 3.1242 0.8573 sec/batch
Epoch 1/20  Iteration 87/3560 Training loss: 3.1201 0.8630 sec/batch
Epoch 1/20  Iteration 88/3560 Training loss: 3.1160 0.8420 sec/batch
Epoch 1/20  Iteration 89/3560 Training loss: 3.1119 0.8451 sec/batch
Epoch 1/20  Iteration 90/3560 Training loss: 3.1080 0.8522 sec/batch
Epoch 1/20  Iteration 91/3560 Training loss: 3.1038 0.8458 sec/batch
Epoch 1/20  Iteration 92/3560 Training loss: 3.0997 0.8469 sec/batch
Epoch 1/20  Iteration 93/3560 Training loss: 3.0957 1.0345 sec/batch
Epoch 1/20  Iteration 94/3560 Training loss: 3.0914 0.9241 sec/batch
Epoch 1/20  Iteration 95/3560 Training loss: 3.0870 0.8840 sec/batch
Epoch 1/20  Iteration 96/3560 Training loss: 3.0827 0.8491 sec/batch
Epoch 1/20  Iteration 97/3560 Training loss: 3.0786 0.8848 sec/batch
Epoch 1/20  Iteration 98/3560 Training loss: 3.0744 0.8440 sec/batch
Epoch 1/20  Iteration 99/3560 Training loss: 3.0703 0.8636 sec/batch
Epoch 1/20  Iteration 100/3560 Training loss: 3.0661 0.8984 sec/batch
Epoch 1/20  Iteration 101/3560 Training loss: 3.0620 0.8453 sec/batch
Epoch 1/20  Iteration 102/3560 Training loss: 3.0579 0.8959 sec/batch
Epoch 1/20  Iteration 103/3560 Training loss: 3.0536 0.8495 sec/batch
Epoch 1/20  Iteration 104/3560 Training loss: 3.0493 0.8481 sec/batch
Epoch 1/20  Iteration 105/3560 Training loss: 3.0451 0.8493 sec/batch
Epoch 1/20  Iteration 106/3560 Training loss: 3.0410 0.8588 sec/batch
Epoch 1/20  Iteration 107/3560 Training loss: 3.0367 0.8480 sec/batch
Epoch 1/20  Iteration 108/3560 Training loss: 3.0326 0.8500 sec/batch
Epoch 1/20  Iteration 109/3560 Training loss: 3.0287 0.8511 sec/batch
Epoch 1/20  Iteration 110/3560 Training loss: 3.0242 0.8487 sec/batch
Epoch 1/20  Iteration 111/3560 Training loss: 3.0201 0.8618 sec/batch
Epoch 1/20  Iteration 112/3560 Training loss: 3.0161 0.8792 sec/batch
Epoch 1/20  Iteration 113/3560 Training loss: 3.0120 0.9152 sec/batch
Epoch 1/20  Iteration 114/3560 Training loss: 3.0079 0.8608 sec/batch
Epoch 1/20  Iteration 115/3560 Training loss: 3.0040 0.9063 sec/batch
Epoch 1/20  Iteration 116/3560 Training loss: 2.9997 0.9050 sec/batch
Epoch 1/20  Iteration 117/3560 Training loss: 2.9958 0.8932 sec/batch
Epoch 1/20  Iteration 118/3560 Training loss: 2.9919 0.9004 sec/batch
Epoch 1/20  Iteration 119/3560 Training loss: 2.9882 0.9408 sec/batch
Epoch 1/20  Iteration 120/3560 Training loss: 2.9843 0.8507 sec/batch
Epoch 1/20  Iteration 121/3560 Training loss: 2.9807 0.8926 sec/batch
Epoch 1/20  Iteration 122/3560 Training loss: 2.9768 0.8635 sec/batch
Epoch 1/20  Iteration 123/3560 Training loss: 2.9730 0.8648 sec/batch
Epoch 1/20  Iteration 124/3560 Training loss: 2.9692 0.8535 sec/batch
Epoch 1/20  Iteration 125/3560 Training loss: 2.9655 0.8557 sec/batch
Epoch 1/20  Iteration 126/3560 Training loss: 2.9615 0.9232 sec/batch
Epoch 1/20  Iteration 127/3560 Training loss: 2.9579 0.8710 sec/batch
Epoch 1/20  Iteration 128/3560 Training loss: 2.9543 0.8620 sec/batch
Epoch 1/20  Iteration 129/3560 Training loss: 2.9506 0.8618 sec/batch
Epoch 1/20  Iteration 130/3560 Training loss: 2.9469 0.8579 sec/batch
Epoch 1/20  Iteration 131/3560 Training loss: 2.9433 0.8522 sec/batch
Epoch 1/20  Iteration 132/3560 Training loss: 2.9395 0.8669 sec/batch
Epoch 1/20  Iteration 133/3560 Training loss: 2.9360 0.8499 sec/batch
Epoch 1/20  Iteration 134/3560 Training loss: 2.9325 0.8466 sec/batch
Epoch 1/20  Iteration 135/3560 Training loss: 2.9288 0.8878 sec/batch
Epoch 1/20  Iteration 136/3560 Training loss: 2.9252 0.8628 sec/batch
Epoch 1/20  Iteration 137/3560 Training loss: 2.9217 0.8550 sec/batch
Epoch 1/20  Iteration 138/3560 Training loss: 2.9183 0.8574 sec/batch
Epoch 1/20  Iteration 139/3560 Training loss: 2.9150 0.8837 sec/batch
Epoch 1/20  Iteration 140/3560 Training loss: 2.9115 0.8664 sec/batch
Epoch 1/20  Iteration 141/3560 Training loss: 2.9084 0.9269 sec/batch
Epoch 1/20  Iteration 142/3560 Training loss: 2.9049 0.9101 sec/batch
Epoch 1/20  Iteration 143/3560 Training loss: 2.9016 0.8962 sec/batch
Epoch 1/20  Iteration 144/3560 Training loss: 2.8983 0.8568 sec/batch
Epoch 1/20  Iteration 145/3560 Training loss: 2.8951 0.8693 sec/batch
Epoch 1/20  Iteration 146/3560 Training loss: 2.8920 0.8501 sec/batch
Epoch 1/20  Iteration 147/3560 Training loss: 2.8887 0.8913 sec/batch
Epoch 1/20  Iteration 148/3560 Training loss: 2.8857 0.8976 sec/batch
Epoch 1/20  Iteration 149/3560 Training loss: 2.8824 0.8466 sec/batch
Epoch 1/20  Iteration 150/3560 Training loss: 2.8791 0.9355 sec/batch
Epoch 1/20  Iteration 151/3560 Training loss: 2.8763 0.9166 sec/batch
Epoch 1/20  Iteration 152/3560 Training loss: 2.8734 0.8572 sec/batch
Epoch 1/20  Iteration 153/3560 Training loss: 2.8704 0.8729 sec/batch
Epoch 1/20  Iteration 154/3560 Training loss: 2.8674 0.8741 sec/batch
Epoch 1/20  Iteration 155/3560 Training loss: 2.8643 0.8808 sec/batch
Epoch 1/20  Iteration 156/3560 Training loss: 2.8612 0.8772 sec/batch
Epoch 1/20  Iteration 157/3560 Training loss: 2.8581 0.9010 sec/batch
Epoch 1/20  Iteration 158/3560 Training loss: 2.8551 0.8564 sec/batch
Epoch 1/20  Iteration 159/3560 Training loss: 2.8519 0.8677 sec/batch
Epoch 1/20  Iteration 160/3560 Training loss: 2.8491 0.8775 sec/batch
Epoch 1/20  Iteration 161/3560 Training loss: 2.8462 0.8490 sec/batch
Epoch 1/20  Iteration 162/3560 Training loss: 2.8431 0.8812 sec/batch
Epoch 1/20  Iteration 163/3560 Training loss: 2.8401 0.8566 sec/batch
Epoch 1/20  Iteration 164/3560 Training loss: 2.8372 0.8407 sec/batch
Epoch 1/20  Iteration 165/3560 Training loss: 2.8344 0.8548 sec/batch
Epoch 1/20  Iteration 166/3560 Training loss: 2.8316 0.8532 sec/batch
Epoch 1/20  Iteration 167/3560 Training loss: 2.8288 0.8795 sec/batch
Epoch 1/20  Iteration 168/3560 Training loss: 2.8261 0.8895 sec/batch
Epoch 1/20  Iteration 169/3560 Training loss: 2.8234 0.8538 sec/batch
Epoch 1/20  Iteration 170/3560 Training loss: 2.8206 0.8456 sec/batch
Epoch 1/20  Iteration 171/3560 Training loss: 2.8179 0.8573 sec/batch
Epoch 1/20  Iteration 172/3560 Training loss: 2.8153 0.8665 sec/batch
Epoch 1/20  Iteration 173/3560 Training loss: 2.8129 0.8560 sec/batch
Epoch 1/20  Iteration 174/3560 Training loss: 2.8105 0.8511 sec/batch
Epoch 1/20  Iteration 175/3560 Training loss: 2.8080 0.8644 sec/batch
Epoch 1/20  Iteration 176/3560 Training loss: 2.8054 0.8746 sec/batch
Epoch 1/20  Iteration 177/3560 Training loss: 2.8027 0.8698 sec/batch
Epoch 1/20  Iteration 178/3560 Training loss: 2.8000 0.8570 sec/batch
Epoch 2/20  Iteration 179/3560 Training loss: 2.3845 0.8392 sec/batch
Epoch 2/20  Iteration 180/3560 Training loss: 2.3397 0.8472 sec/batch
Epoch 2/20  Iteration 181/3560 Training loss: 2.3282 0.8459 sec/batch
Epoch 2/20  Iteration 182/3560 Training loss: 2.3261 0.8491 sec/batch
Epoch 2/20  Iteration 183/3560 Training loss: 2.3229 0.8506 sec/batch
Epoch 2/20  Iteration 184/3560 Training loss: 2.3208 0.8693 sec/batch
Epoch 2/20  Iteration 185/3560 Training loss: 2.3222 0.8814 sec/batch
Epoch 2/20  Iteration 186/3560 Training loss: 2.3223 0.8547 sec/batch
Epoch 2/20  Iteration 187/3560 Training loss: 2.3229 0.8546 sec/batch
Epoch 2/20  Iteration 188/3560 Training loss: 2.3221 0.9317 sec/batch
Epoch 2/20  Iteration 189/3560 Training loss: 2.3198 0.9321 sec/batch
Epoch 2/20  Iteration 190/3560 Training loss: 2.3190 0.8870 sec/batch
Epoch 2/20  Iteration 191/3560 Training loss: 2.3187 0.8647 sec/batch
Epoch 2/20  Iteration 192/3560 Training loss: 2.3206 0.8705 sec/batch
Epoch 2/20  Iteration 193/3560 Training loss: 2.3203 0.8982 sec/batch
Epoch 2/20  Iteration 194/3560 Training loss: 2.3197 0.9118 sec/batch
Epoch 2/20  Iteration 195/3560 Training loss: 2.3189 0.8488 sec/batch
Epoch 2/20  Iteration 196/3560 Training loss: 2.3200 0.9109 sec/batch
Epoch 2/20  Iteration 197/3560 Training loss: 2.3192 0.8779 sec/batch
Epoch 2/20  Iteration 198/3560 Training loss: 2.3171 0.8885 sec/batch
Epoch 2/20  Iteration 199/3560 Training loss: 2.3155 0.9043 sec/batch
Epoch 2/20  Iteration 200/3560 Training loss: 2.3162 0.8542 sec/batch
Epoch 2/20  Iteration 201/3560 Training loss: 2.3148 0.8745 sec/batch
Epoch 2/20  Iteration 202/3560 Training loss: 2.3132 0.8864 sec/batch
Epoch 2/20  Iteration 203/3560 Training loss: 2.3121 0.8571 sec/batch
Epoch 2/20  Iteration 204/3560 Training loss: 2.3107 0.8996 sec/batch
Epoch 2/20  Iteration 205/3560 Training loss: 2.3093 0.9436 sec/batch
Epoch 2/20  Iteration 206/3560 Training loss: 2.3083 0.9864 sec/batch
Epoch 2/20  Iteration 207/3560 Training loss: 2.3079 0.9328 sec/batch
Epoch 2/20  Iteration 208/3560 Training loss: 2.3074 0.9574 sec/batch
Epoch 2/20  Iteration 209/3560 Training loss: 2.3067 0.9228 sec/batch
Epoch 2/20  Iteration 210/3560 Training loss: 2.3056 0.8820 sec/batch
Epoch 2/20  Iteration 211/3560 Training loss: 2.3047 0.8983 sec/batch
Epoch 2/20  Iteration 212/3560 Training loss: 2.3042 0.8494 sec/batch
Epoch 2/20  Iteration 213/3560 Training loss: 2.3031 0.9356 sec/batch
Epoch 2/20  Iteration 214/3560 Training loss: 2.3022 0.8776 sec/batch
Epoch 2/20  Iteration 215/3560 Training loss: 2.3009 0.8733 sec/batch
Epoch 2/20  Iteration 216/3560 Training loss: 2.2991 0.8641 sec/batch
Epoch 2/20  Iteration 217/3560 Training loss: 2.2975 0.8847 sec/batch
Epoch 2/20  Iteration 218/3560 Training loss: 2.2960 0.9666 sec/batch
Epoch 2/20  Iteration 219/3560 Training loss: 2.2947 0.8790 sec/batch
Epoch 2/20  Iteration 220/3560 Training loss: 2.2934 0.8895 sec/batch
Epoch 2/20  Iteration 221/3560 Training loss: 2.2921 0.8860 sec/batch
Epoch 2/20  Iteration 222/3560 Training loss: 2.2905 0.8854 sec/batch
Epoch 2/20  Iteration 223/3560 Training loss: 2.2894 0.9630 sec/batch
Epoch 2/20  Iteration 224/3560 Training loss: 2.2873 0.8831 sec/batch
Epoch 2/20  Iteration 225/3560 Training loss: 2.2868 0.8932 sec/batch
Epoch 2/20  Iteration 226/3560 Training loss: 2.2856 0.8986 sec/batch
Epoch 2/20  Iteration 227/3560 Training loss: 2.2845 0.8957 sec/batch
Epoch 2/20  Iteration 228/3560 Training loss: 2.2841 0.8700 sec/batch
Epoch 2/20  Iteration 229/3560 Training loss: 2.2827 0.8741 sec/batch
Epoch 2/20  Iteration 230/3560 Training loss: 2.2821 0.8825 sec/batch
Epoch 2/20  Iteration 231/3560 Training loss: 2.2809 0.9320 sec/batch
Epoch 2/20  Iteration 232/3560 Training loss: 2.2797 0.8749 sec/batch
Epoch 2/20  Iteration 233/3560 Training loss: 2.2786 0.9252 sec/batch
Epoch 2/20  Iteration 234/3560 Training loss: 2.2778 0.8762 sec/batch
Epoch 2/20  Iteration 235/3560 Training loss: 2.2769 0.9165 sec/batch
Epoch 2/20  Iteration 236/3560 Training loss: 2.2759 0.9183 sec/batch
Epoch 2/20  Iteration 237/3560 Training loss: 2.2749 0.9017 sec/batch
Epoch 2/20  Iteration 238/3560 Training loss: 2.2745 0.8470 sec/batch
Epoch 2/20  Iteration 239/3560 Training loss: 2.2747 0.8850 sec/batch
Epoch 2/20  Iteration 240/3560 Training loss: 2.2754 0.8736 sec/batch
Epoch 2/20  Iteration 241/3560 Training loss: 2.2763 0.8591 sec/batch
Epoch 2/20  Iteration 242/3560 Training loss: 2.2766 0.8449 sec/batch
Epoch 2/20  Iteration 243/3560 Training loss: 2.2769 0.8625 sec/batch
Epoch 2/20  Iteration 244/3560 Training loss: 2.2771 0.8650 sec/batch
Epoch 2/20  Iteration 245/3560 Training loss: 2.2773 0.8726 sec/batch
Epoch 2/20  Iteration 246/3560 Training loss: 2.2768 0.8634 sec/batch
Epoch 2/20  Iteration 247/3560 Training loss: 2.2765 0.9156 sec/batch
Epoch 2/20  Iteration 248/3560 Training loss: 2.2762 0.8677 sec/batch
Epoch 2/20  Iteration 249/3560 Training loss: 2.2758 0.8529 sec/batch
Epoch 2/20  Iteration 250/3560 Training loss: 2.2751 0.9338 sec/batch
Epoch 2/20  Iteration 251/3560 Training loss: 2.2744 0.9140 sec/batch
Epoch 2/20  Iteration 252/3560 Training loss: 2.2733 0.9117 sec/batch
Epoch 2/20  Iteration 253/3560 Training loss: 2.2723 0.9086 sec/batch
Epoch 2/20  Iteration 254/3560 Training loss: 2.2719 0.8725 sec/batch
Epoch 2/20  Iteration 255/3560 Training loss: 2.2709 0.9696 sec/batch
Epoch 2/20  Iteration 256/3560 Training loss: 2.2702 0.8797 sec/batch
Epoch 2/20  Iteration 257/3560 Training loss: 2.2690 0.9733 sec/batch
Epoch 2/20  Iteration 258/3560 Training loss: 2.2679 0.8962 sec/batch
Epoch 2/20  Iteration 259/3560 Training loss: 2.2666 0.8756 sec/batch
Epoch 2/20  Iteration 260/3560 Training loss: 2.2659 0.8906 sec/batch
Epoch 2/20  Iteration 261/3560 Training loss: 2.2647 0.8899 sec/batch
Epoch 2/20  Iteration 262/3560 Training loss: 2.2636 0.8414 sec/batch
Epoch 2/20  Iteration 263/3560 Training loss: 2.2621 0.8478 sec/batch
Epoch 2/20  Iteration 264/3560 Training loss: 2.2611 0.8519 sec/batch
Epoch 2/20  Iteration 265/3560 Training loss: 2.2602 0.8475 sec/batch
Epoch 2/20  Iteration 266/3560 Training loss: 2.2592 0.8619 sec/batch
Epoch 2/20  Iteration 267/3560 Training loss: 2.2581 0.8652 sec/batch
Epoch 2/20  Iteration 268/3560 Training loss: 2.2573 0.8817 sec/batch
Epoch 2/20  Iteration 269/3560 Training loss: 2.2563 0.8893 sec/batch
Epoch 2/20  Iteration 270/3560 Training loss: 2.2554 0.8772 sec/batch
Epoch 2/20  Iteration 271/3560 Training loss: 2.2542 0.9195 sec/batch
Epoch 2/20  Iteration 272/3560 Training loss: 2.2532 0.8698 sec/batch
Epoch 2/20  Iteration 273/3560 Training loss: 2.2522 0.9173 sec/batch
Epoch 2/20  Iteration 274/3560 Training loss: 2.2512 0.8764 sec/batch
Epoch 2/20  Iteration 275/3560 Training loss: 2.2503 0.8542 sec/batch
Epoch 2/20  Iteration 276/3560 Training loss: 2.2494 0.8510 sec/batch
Epoch 2/20  Iteration 277/3560 Training loss: 2.2483 0.8609 sec/batch
Epoch 2/20  Iteration 278/3560 Training loss: 2.2472 0.8754 sec/batch
Epoch 2/20  Iteration 279/3560 Training loss: 2.2465 0.8592 sec/batch
Epoch 2/20  Iteration 280/3560 Training loss: 2.2457 0.8783 sec/batch
Epoch 2/20  Iteration 281/3560 Training loss: 2.2446 0.8585 sec/batch
Epoch 2/20  Iteration 282/3560 Training loss: 2.2437 0.8895 sec/batch
Epoch 2/20  Iteration 283/3560 Training loss: 2.2427 0.8591 sec/batch
Epoch 2/20  Iteration 284/3560 Training loss: 2.2419 0.8742 sec/batch
Epoch 2/20  Iteration 285/3560 Training loss: 2.2410 0.8904 sec/batch
Epoch 2/20  Iteration 286/3560 Training loss: 2.2404 0.8511 sec/batch
Epoch 2/20  Iteration 287/3560 Training loss: 2.2396 0.8538 sec/batch
Epoch 2/20  Iteration 288/3560 Training loss: 2.2387 0.8408 sec/batch
Epoch 2/20  Iteration 289/3560 Training loss: 2.2379 0.8482 sec/batch
Epoch 2/20  Iteration 290/3560 Training loss: 2.2372 0.8456 sec/batch
Epoch 2/20  Iteration 291/3560 Training loss: 2.2363 0.8603 sec/batch
Epoch 2/20  Iteration 292/3560 Training loss: 2.2354 0.8415 sec/batch
Epoch 2/20  Iteration 293/3560 Training loss: 2.2346 0.8980 sec/batch
Epoch 2/20  Iteration 294/3560 Training loss: 2.2334 0.8437 sec/batch
Epoch 2/20  Iteration 295/3560 Training loss: 2.2326 0.8446 sec/batch
Epoch 2/20  Iteration 296/3560 Training loss: 2.2318 0.8414 sec/batch
Epoch 2/20  Iteration 297/3560 Training loss: 2.2312 0.8450 sec/batch
Epoch 2/20  Iteration 298/3560 Training loss: 2.2306 0.8892 sec/batch
Epoch 2/20  Iteration 299/3560 Training loss: 2.2299 0.9129 sec/batch
Epoch 2/20  Iteration 300/3560 Training loss: 2.2290 0.9377 sec/batch
Epoch 2/20  Iteration 301/3560 Training loss: 2.2281 0.8639 sec/batch
Epoch 2/20  Iteration 302/3560 Training loss: 2.2275 0.8615 sec/batch
Epoch 2/20  Iteration 303/3560 Training loss: 2.2268 0.8633 sec/batch
Epoch 2/20  Iteration 304/3560 Training loss: 2.2259 0.8383 sec/batch
Epoch 2/20  Iteration 305/3560 Training loss: 2.2252 0.8385 sec/batch
Epoch 2/20  Iteration 306/3560 Training loss: 2.2246 0.8429 sec/batch
Epoch 2/20  Iteration 307/3560 Training loss: 2.2239 0.8439 sec/batch
Epoch 2/20  Iteration 308/3560 Training loss: 2.2231 0.8756 sec/batch
Epoch 2/20  Iteration 309/3560 Training loss: 2.2223 0.8691 sec/batch
Epoch 2/20  Iteration 310/3560 Training loss: 2.2214 0.8520 sec/batch
Epoch 2/20  Iteration 311/3560 Training loss: 2.2207 0.8690 sec/batch
Epoch 2/20  Iteration 312/3560 Training loss: 2.2202 0.8604 sec/batch
Epoch 2/20  Iteration 313/3560 Training loss: 2.2194 0.9039 sec/batch
Epoch 2/20  Iteration 314/3560 Training loss: 2.2188 0.8561 sec/batch
Epoch 2/20  Iteration 315/3560 Training loss: 2.2180 0.8556 sec/batch
Epoch 2/20  Iteration 316/3560 Training loss: 2.2174 1.0006 sec/batch
Epoch 2/20  Iteration 317/3560 Training loss: 2.2170 0.8772 sec/batch
Epoch 2/20  Iteration 318/3560 Training loss: 2.2161 0.9147 sec/batch
Epoch 2/20  Iteration 319/3560 Training loss: 2.2156 0.8616 sec/batch
Epoch 2/20  Iteration 320/3560 Training loss: 2.2148 0.8652 sec/batch
Epoch 2/20  Iteration 321/3560 Training loss: 2.2142 0.8503 sec/batch
Epoch 2/20  Iteration 322/3560 Training loss: 2.2135 0.9025 sec/batch
Epoch 2/20  Iteration 323/3560 Training loss: 2.2127 0.8933 sec/batch
Epoch 2/20  Iteration 324/3560 Training loss: 2.2122 0.8435 sec/batch
Epoch 2/20  Iteration 325/3560 Training loss: 2.2115 0.9475 sec/batch
Epoch 2/20  Iteration 326/3560 Training loss: 2.2109 0.8662 sec/batch
Epoch 2/20  Iteration 327/3560 Training loss: 2.2102 0.8605 sec/batch
Epoch 2/20  Iteration 328/3560 Training loss: 2.2093 0.8385 sec/batch
Epoch 2/20  Iteration 329/3560 Training loss: 2.2086 0.8543 sec/batch
Epoch 2/20  Iteration 330/3560 Training loss: 2.2082 0.8466 sec/batch
Epoch 2/20  Iteration 331/3560 Training loss: 2.2076 0.8442 sec/batch
Epoch 2/20  Iteration 332/3560 Training loss: 2.2070 0.8490 sec/batch
Epoch 2/20  Iteration 333/3560 Training loss: 2.2063 0.8773 sec/batch
Epoch 2/20  Iteration 334/3560 Training loss: 2.2056 0.8470 sec/batch
Epoch 2/20  Iteration 335/3560 Training loss: 2.2049 0.8568 sec/batch
Epoch 2/20  Iteration 336/3560 Training loss: 2.2042 0.8414 sec/batch
Epoch 2/20  Iteration 337/3560 Training loss: 2.2033 0.8633 sec/batch
Epoch 2/20  Iteration 338/3560 Training loss: 2.2029 0.8723 sec/batch
Epoch 2/20  Iteration 339/3560 Training loss: 2.2023 0.9104 sec/batch
Epoch 2/20  Iteration 340/3560 Training loss: 2.2016 0.9024 sec/batch
Epoch 2/20  Iteration 341/3560 Training loss: 2.2009 0.9442 sec/batch
Epoch 2/20  Iteration 342/3560 Training loss: 2.2003 0.9068 sec/batch
Epoch 2/20  Iteration 343/3560 Training loss: 2.1996 0.9180 sec/batch
Epoch 2/20  Iteration 344/3560 Training loss: 2.1989 0.9259 sec/batch
Epoch 2/20  Iteration 345/3560 Training loss: 2.1983 0.8993 sec/batch
Epoch 2/20  Iteration 346/3560 Training loss: 2.1978 0.8398 sec/batch
Epoch 2/20  Iteration 347/3560 Training loss: 2.1972 0.8740 sec/batch
Epoch 2/20  Iteration 348/3560 Training loss: 2.1965 0.8555 sec/batch
Epoch 2/20  Iteration 349/3560 Training loss: 2.1958 0.8471 sec/batch
Epoch 2/20  Iteration 350/3560 Training loss: 2.1950 0.8897 sec/batch
Epoch 2/20  Iteration 351/3560 Training loss: 2.1945 0.8594 sec/batch
Epoch 2/20  Iteration 352/3560 Training loss: 2.1940 0.8555 sec/batch
Epoch 2/20  Iteration 353/3560 Training loss: 2.1934 0.8712 sec/batch
Epoch 2/20  Iteration 354/3560 Training loss: 2.1927 0.8928 sec/batch
Epoch 2/20  Iteration 355/3560 Training loss: 2.1920 0.8683 sec/batch
Epoch 2/20  Iteration 356/3560 Training loss: 2.1914 0.8550 sec/batch
Epoch 3/20  Iteration 357/3560 Training loss: 2.1339 0.8493 sec/batch
Epoch 3/20  Iteration 358/3560 Training loss: 2.0858 0.8563 sec/batch
Epoch 3/20  Iteration 359/3560 Training loss: 2.0763 0.8719 sec/batch
Epoch 3/20  Iteration 360/3560 Training loss: 2.0713 0.8687 sec/batch
Epoch 3/20  Iteration 361/3560 Training loss: 2.0689 0.8389 sec/batch
Epoch 3/20  Iteration 362/3560 Training loss: 2.0625 0.8387 sec/batch
Epoch 3/20  Iteration 363/3560 Training loss: 2.0639 0.8984 sec/batch
Epoch 3/20  Iteration 364/3560 Training loss: 2.0655 0.8783 sec/batch
Epoch 3/20  Iteration 365/3560 Training loss: 2.0675 0.9801 sec/batch
Epoch 3/20  Iteration 366/3560 Training loss: 2.0664 0.9077 sec/batch
Epoch 3/20  Iteration 367/3560 Training loss: 2.0638 0.8439 sec/batch
Epoch 3/20  Iteration 368/3560 Training loss: 2.0619 0.8463 sec/batch
Epoch 3/20  Iteration 369/3560 Training loss: 2.0624 0.8727 sec/batch
Epoch 3/20  Iteration 370/3560 Training loss: 2.0640 0.8962 sec/batch
Epoch 3/20  Iteration 371/3560 Training loss: 2.0630 0.8464 sec/batch
Epoch 3/20  Iteration 372/3560 Training loss: 2.0616 0.9194 sec/batch
Epoch 3/20  Iteration 373/3560 Training loss: 2.0608 0.9309 sec/batch
Epoch 3/20  Iteration 374/3560 Training loss: 2.0631 0.8575 sec/batch
Epoch 3/20  Iteration 375/3560 Training loss: 2.0623 0.8927 sec/batch
Epoch 3/20  Iteration 376/3560 Training loss: 2.0614 0.8417 sec/batch
Epoch 3/20  Iteration 377/3560 Training loss: 2.0602 0.8495 sec/batch
Epoch 3/20  Iteration 378/3560 Training loss: 2.0608 0.8548 sec/batch
Epoch 3/20  Iteration 379/3560 Training loss: 2.0601 0.8958 sec/batch
Epoch 3/20  Iteration 380/3560 Training loss: 2.0588 0.8535 sec/batch
Epoch 3/20  Iteration 381/3560 Training loss: 2.0580 0.8426 sec/batch
Epoch 3/20  Iteration 382/3560 Training loss: 2.0565 0.8457 sec/batch
Epoch 3/20  Iteration 383/3560 Training loss: 2.0549 0.8533 sec/batch
Epoch 3/20  Iteration 384/3560 Training loss: 2.0547 0.8454 sec/batch
Epoch 3/20  Iteration 385/3560 Training loss: 2.0554 0.8454 sec/batch
Epoch 3/20  Iteration 386/3560 Training loss: 2.0552 0.9406 sec/batch
Epoch 3/20  Iteration 387/3560 Training loss: 2.0546 0.8497 sec/batch
Epoch 3/20  Iteration 388/3560 Training loss: 2.0537 0.8535 sec/batch
Epoch 3/20  Iteration 389/3560 Training loss: 2.0532 0.8448 sec/batch
Epoch 3/20  Iteration 390/3560 Training loss: 2.0536 0.8399 sec/batch
Epoch 3/20  Iteration 391/3560 Training loss: 2.0530 0.8482 sec/batch
Epoch 3/20  Iteration 392/3560 Training loss: 2.0524 0.8469 sec/batch
Epoch 3/20  Iteration 393/3560 Training loss: 2.0516 0.8430 sec/batch
Epoch 3/20  Iteration 394/3560 Training loss: 2.0498 0.8699 sec/batch
Epoch 3/20  Iteration 395/3560 Training loss: 2.0483 0.8461 sec/batch
Epoch 3/20  Iteration 396/3560 Training loss: 2.0472 0.8418 sec/batch
Epoch 3/20  Iteration 397/3560 Training loss: 2.0463 0.8493 sec/batch
Epoch 3/20  Iteration 398/3560 Training loss: 2.0458 0.8480 sec/batch
Epoch 3/20  Iteration 399/3560 Training loss: 2.0451 0.8577 sec/batch
Epoch 3/20  Iteration 400/3560 Training loss: 2.0437 0.8834 sec/batch
Epoch 3/20  Iteration 401/3560 Training loss: 2.0432 0.8822 sec/batch
Epoch 3/20  Iteration 402/3560 Training loss: 2.0415 0.9395 sec/batch
Epoch 3/20  Iteration 403/3560 Training loss: 2.0412 0.8939 sec/batch
Epoch 3/20  Iteration 404/3560 Training loss: 2.0402 0.8585 sec/batch
Epoch 3/20  Iteration 405/3560 Training loss: 2.0395 0.8622 sec/batch
Epoch 3/20  Iteration 406/3560 Training loss: 2.0398 0.8644 sec/batch
Epoch 3/20  Iteration 407/3560 Training loss: 2.0388 0.8893 sec/batch
Epoch 3/20  Iteration 408/3560 Training loss: 2.0389 0.8796 sec/batch
Epoch 3/20  Iteration 409/3560 Training loss: 2.0384 0.8756 sec/batch
Epoch 3/20  Iteration 410/3560 Training loss: 2.0377 0.9872 sec/batch
Epoch 3/20  Iteration 411/3560 Training loss: 2.0371 1.1357 sec/batch
Epoch 3/20  Iteration 412/3560 Training loss: 2.0369 1.0156 sec/batch
Epoch 3/20  Iteration 413/3560 Training loss: 2.0366 0.8601 sec/batch
Epoch 3/20  Iteration 414/3560 Training loss: 2.0360 0.8520 sec/batch
Epoch 3/20  Iteration 415/3560 Training loss: 2.0352 0.8474 sec/batch
Epoch 3/20  Iteration 416/3560 Training loss: 2.0353 0.8505 sec/batch
Epoch 3/20  Iteration 417/3560 Training loss: 2.0348 0.8770 sec/batch
Epoch 3/20  Iteration 418/3560 Training loss: 2.0349 0.8752 sec/batch
Epoch 3/20  Iteration 419/3560 Training loss: 2.0349 0.9437 sec/batch
Epoch 3/20  Iteration 420/3560 Training loss: 2.0346 0.8798 sec/batch
Epoch 3/20  Iteration 421/3560 Training loss: 2.0341 0.9616 sec/batch
Epoch 3/20  Iteration 422/3560 Training loss: 2.0339 0.9209 sec/batch
Epoch 3/20  Iteration 423/3560 Training loss: 2.0334 0.8701 sec/batch
Epoch 3/20  Iteration 424/3560 Training loss: 2.0325 0.8737 sec/batch
Epoch 3/20  Iteration 425/3560 Training loss: 2.0320 0.8726 sec/batch
Epoch 3/20  Iteration 426/3560 Training loss: 2.0314 0.8769 sec/batch
Epoch 3/20  Iteration 427/3560 Training loss: 2.0313 0.9369 sec/batch
Epoch 3/20  Iteration 428/3560 Training loss: 2.0309 0.9389 sec/batch
Epoch 3/20  Iteration 429/3560 Training loss: 2.0308 0.9778 sec/batch
Epoch 3/20  Iteration 430/3560 Training loss: 2.0301 1.0373 sec/batch
Epoch 3/20  Iteration 431/3560 Training loss: 2.0296 1.0563 sec/batch
Epoch 3/20  Iteration 432/3560 Training loss: 2.0295 1.2834 sec/batch
Epoch 3/20  Iteration 433/3560 Training loss: 2.0288 1.2769 sec/batch
Epoch 3/20  Iteration 434/3560 Training loss: 2.0285 1.1134 sec/batch
Epoch 3/20  Iteration 435/3560 Training loss: 2.0276 1.1964 sec/batch
Epoch 3/20  Iteration 436/3560 Training loss: 2.0270 1.2698 sec/batch
Epoch 3/20  Iteration 437/3560 Training loss: 2.0263 1.0477 sec/batch
Epoch 3/20  Iteration 438/3560 Training loss: 2.0260 0.9458 sec/batch
Epoch 3/20  Iteration 439/3560 Training loss: 2.0250 0.9056 sec/batch
Epoch 3/20  Iteration 440/3560 Training loss: 2.0244 0.9422 sec/batch
Epoch 3/20  Iteration 441/3560 Training loss: 2.0235 0.8904 sec/batch
Epoch 3/20  Iteration 442/3560 Training loss: 2.0227 0.8950 sec/batch
Epoch 3/20  Iteration 443/3560 Training loss: 2.0222 0.9498 sec/batch
Epoch 3/20  Iteration 444/3560 Training loss: 2.0215 0.9421 sec/batch
Epoch 3/20  Iteration 445/3560 Training loss: 2.0206 0.9934 sec/batch
Epoch 3/20  Iteration 446/3560 Training loss: 2.0203 0.9519 sec/batch
Epoch 3/20  Iteration 447/3560 Training loss: 2.0197 0.9161 sec/batch
Epoch 3/20  Iteration 448/3560 Training loss: 2.0192 0.8719 sec/batch
Epoch 3/20  Iteration 449/3560 Training loss: 2.0184 0.8999 sec/batch
Epoch 3/20  Iteration 450/3560 Training loss: 2.0178 0.9094 sec/batch
Epoch 3/20  Iteration 451/3560 Training loss: 2.0171 1.0111 sec/batch
Epoch 3/20  Iteration 452/3560 Training loss: 2.0166 0.9254 sec/batch
Epoch 3/20  Iteration 453/3560 Training loss: 2.0159 0.9263 sec/batch
Epoch 3/20  Iteration 454/3560 Training loss: 2.0152 0.9025 sec/batch
Epoch 3/20  Iteration 455/3560 Training loss: 2.0145 0.8909 sec/batch
Epoch 3/20  Iteration 456/3560 Training loss: 2.0136 0.8896 sec/batch
Epoch 3/20  Iteration 457/3560 Training loss: 2.0133 0.8847 sec/batch
Epoch 3/20  Iteration 458/3560 Training loss: 2.0128 0.8863 sec/batch
Epoch 3/20  Iteration 459/3560 Training loss: 2.0121 0.9958 sec/batch
Epoch 3/20  Iteration 460/3560 Training loss: 2.0115 1.1696 sec/batch
Epoch 3/20  Iteration 461/3560 Training loss: 2.0108 1.2508 sec/batch
Epoch 3/20  Iteration 462/3560 Training loss: 2.0104 1.1374 sec/batch
Epoch 3/20  Iteration 463/3560 Training loss: 2.0099 1.1436 sec/batch
Epoch 3/20  Iteration 464/3560 Training loss: 2.0095 1.2240 sec/batch
Epoch 3/20  Iteration 465/3560 Training loss: 2.0091 1.0697 sec/batch
Epoch 3/20  Iteration 466/3560 Training loss: 2.0086 0.9073 sec/batch
Epoch 3/20  Iteration 467/3560 Training loss: 2.0081 0.8504 sec/batch
Epoch 3/20  Iteration 468/3560 Training loss: 2.0075 0.8807 sec/batch
Epoch 3/20  Iteration 469/3560 Training loss: 2.0070 0.8735 sec/batch
Epoch 3/20  Iteration 470/3560 Training loss: 2.0063 0.9443 sec/batch
Epoch 3/20  Iteration 471/3560 Training loss: 2.0057 1.0455 sec/batch
Epoch 3/20  Iteration 472/3560 Training loss: 2.0050 0.8917 sec/batch
Epoch 3/20  Iteration 473/3560 Training loss: 2.0045 0.9568 sec/batch
Epoch 3/20  Iteration 474/3560 Training loss: 2.0040 0.8823 sec/batch
Epoch 3/20  Iteration 475/3560 Training loss: 2.0035 0.8842 sec/batch
Epoch 3/20  Iteration 476/3560 Training loss: 2.0030 0.8649 sec/batch
Epoch 3/20  Iteration 477/3560 Training loss: 2.0026 0.8446 sec/batch
Epoch 3/20  Iteration 478/3560 Training loss: 2.0018 0.8428 sec/batch
Epoch 3/20  Iteration 479/3560 Training loss: 2.0012 0.8450 sec/batch
Epoch 3/20  Iteration 480/3560 Training loss: 2.0010 1.0421 sec/batch
Epoch 3/20  Iteration 481/3560 Training loss: 2.0004 1.1683 sec/batch
Epoch 3/20  Iteration 482/3560 Training loss: 1.9997 1.1089 sec/batch
Epoch 3/20  Iteration 483/3560 Training loss: 1.9994 0.9403 sec/batch
Epoch 3/20  Iteration 484/3560 Training loss: 1.9990 0.8422 sec/batch
Epoch 3/20  Iteration 485/3560 Training loss: 1.9985 0.8421 sec/batch
Epoch 3/20  Iteration 486/3560 Training loss: 1.9981 0.8431 sec/batch
Epoch 3/20  Iteration 487/3560 Training loss: 1.9974 0.8481 sec/batch
Epoch 3/20  Iteration 488/3560 Training loss: 1.9968 0.8561 sec/batch
Epoch 3/20  Iteration 489/3560 Training loss: 1.9963 0.8773 sec/batch
Epoch 3/20  Iteration 490/3560 Training loss: 1.9960 0.8414 sec/batch
Epoch 3/20  Iteration 491/3560 Training loss: 1.9956 0.8472 sec/batch
Epoch 3/20  Iteration 492/3560 Training loss: 1.9952 0.8376 sec/batch
Epoch 3/20  Iteration 493/3560 Training loss: 1.9949 0.8416 sec/batch
Epoch 3/20  Iteration 494/3560 Training loss: 1.9945 0.8514 sec/batch
Epoch 3/20  Iteration 495/3560 Training loss: 1.9943 0.8501 sec/batch
Epoch 3/20  Iteration 496/3560 Training loss: 1.9938 0.8413 sec/batch
Epoch 3/20  Iteration 497/3560 Training loss: 1.9936 0.8406 sec/batch
Epoch 3/20  Iteration 498/3560 Training loss: 1.9931 0.8478 sec/batch
Epoch 3/20  Iteration 499/3560 Training loss: 1.9927 0.8426 sec/batch
Epoch 3/20  Iteration 500/3560 Training loss: 1.9923 0.8534 sec/batch
Epoch 3/20  Iteration 501/3560 Training loss: 1.9918 0.8501 sec/batch
Epoch 3/20  Iteration 502/3560 Training loss: 1.9916 0.8405 sec/batch
Epoch 3/20  Iteration 503/3560 Training loss: 1.9911 0.8485 sec/batch
Epoch 3/20  Iteration 504/3560 Training loss: 1.9909 0.8479 sec/batch
Epoch 3/20  Iteration 505/3560 Training loss: 1.9905 0.8444 sec/batch
Epoch 3/20  Iteration 506/3560 Training loss: 1.9900 0.8373 sec/batch
Epoch 3/20  Iteration 507/3560 Training loss: 1.9894 0.8548 sec/batch
Epoch 3/20  Iteration 508/3560 Training loss: 1.9892 0.8417 sec/batch
Epoch 3/20  Iteration 509/3560 Training loss: 1.9888 0.8407 sec/batch
Epoch 3/20  Iteration 510/3560 Training loss: 1.9885 0.8445 sec/batch
Epoch 3/20  Iteration 511/3560 Training loss: 1.9881 0.8590 sec/batch
Epoch 3/20  Iteration 512/3560 Training loss: 1.9877 0.8469 sec/batch
Epoch 3/20  Iteration 513/3560 Training loss: 1.9873 0.8482 sec/batch
Epoch 3/20  Iteration 514/3560 Training loss: 1.9868 0.8480 sec/batch
Epoch 3/20  Iteration 515/3560 Training loss: 1.9863 0.8390 sec/batch
Epoch 3/20  Iteration 516/3560 Training loss: 1.9861 0.8476 sec/batch
Epoch 3/20  Iteration 517/3560 Training loss: 1.9859 0.8390 sec/batch
Epoch 3/20  Iteration 518/3560 Training loss: 1.9854 0.8406 sec/batch
Epoch 3/20  Iteration 519/3560 Training loss: 1.9851 0.8448 sec/batch
Epoch 3/20  Iteration 520/3560 Training loss: 1.9847 0.8470 sec/batch
Epoch 3/20  Iteration 521/3560 Training loss: 1.9843 0.8467 sec/batch
Epoch 3/20  Iteration 522/3560 Training loss: 1.9839 0.8400 sec/batch
Epoch 3/20  Iteration 523/3560 Training loss: 1.9836 0.8473 sec/batch
Epoch 3/20  Iteration 524/3560 Training loss: 1.9835 1.0367 sec/batch
Epoch 3/20  Iteration 525/3560 Training loss: 1.9831 0.8504 sec/batch
Epoch 3/20  Iteration 526/3560 Training loss: 1.9827 0.8889 sec/batch
Epoch 3/20  Iteration 527/3560 Training loss: 1.9823 0.8947 sec/batch
Epoch 3/20  Iteration 528/3560 Training loss: 1.9818 0.8596 sec/batch
Epoch 3/20  Iteration 529/3560 Training loss: 1.9816 0.8483 sec/batch
Epoch 3/20  Iteration 530/3560 Training loss: 1.9812 0.8706 sec/batch
Epoch 3/20  Iteration 531/3560 Training loss: 1.9808 0.8598 sec/batch
Epoch 3/20  Iteration 532/3560 Training loss: 1.9804 0.8487 sec/batch
Epoch 3/20  Iteration 533/3560 Training loss: 1.9799 0.8463 sec/batch
Epoch 3/20  Iteration 534/3560 Training loss: 1.9796 0.8561 sec/batch
Epoch 4/20  Iteration 535/3560 Training loss: 1.9779 0.8481 sec/batch
Epoch 4/20  Iteration 536/3560 Training loss: 1.9320 0.8520 sec/batch
Epoch 4/20  Iteration 537/3560 Training loss: 1.9231 0.8952 sec/batch
Epoch 4/20  Iteration 538/3560 Training loss: 1.9176 0.8501 sec/batch
Epoch 4/20  Iteration 539/3560 Training loss: 1.9140 0.8629 sec/batch
Epoch 4/20  Iteration 540/3560 Training loss: 1.9067 0.8474 sec/batch
Epoch 4/20  Iteration 541/3560 Training loss: 1.9058 0.8522 sec/batch
Epoch 4/20  Iteration 542/3560 Training loss: 1.9045 0.8589 sec/batch
Epoch 4/20  Iteration 543/3560 Training loss: 1.9067 0.8525 sec/batch
Epoch 4/20  Iteration 544/3560 Training loss: 1.9066 0.8958 sec/batch
Epoch 4/20  Iteration 545/3560 Training loss: 1.9029 0.8530 sec/batch
Epoch 4/20  Iteration 546/3560 Training loss: 1.9004 0.8644 sec/batch
Epoch 4/20  Iteration 547/3560 Training loss: 1.9009 0.8531 sec/batch
Epoch 4/20  Iteration 548/3560 Training loss: 1.9024 0.8441 sec/batch
Epoch 4/20  Iteration 549/3560 Training loss: 1.9009 0.8477 sec/batch
Epoch 4/20  Iteration 550/3560 Training loss: 1.8989 0.8497 sec/batch
Epoch 4/20  Iteration 551/3560 Training loss: 1.8985 0.8510 sec/batch
Epoch 4/20  Iteration 552/3560 Training loss: 1.9005 0.8713 sec/batch
Epoch 4/20  Iteration 553/3560 Training loss: 1.8997 0.8564 sec/batch
Epoch 4/20  Iteration 554/3560 Training loss: 1.9000 0.8497 sec/batch
Epoch 4/20  Iteration 555/3560 Training loss: 1.8988 0.8528 sec/batch
Epoch 4/20  Iteration 556/3560 Training loss: 1.8993 0.8496 sec/batch
Epoch 4/20  Iteration 557/3560 Training loss: 1.8986 0.8515 sec/batch
Epoch 4/20  Iteration 558/3560 Training loss: 1.8976 0.8516 sec/batch
Epoch 4/20  Iteration 559/3560 Training loss: 1.8970 0.9103 sec/batch
Epoch 4/20  Iteration 560/3560 Training loss: 1.8956 0.8715 sec/batch
Epoch 4/20  Iteration 561/3560 Training loss: 1.8942 0.9228 sec/batch
Epoch 4/20  Iteration 562/3560 Training loss: 1.8943 0.8829 sec/batch
Epoch 4/20  Iteration 563/3560 Training loss: 1.8950 0.9046 sec/batch
Epoch 4/20  Iteration 564/3560 Training loss: 1.8950 0.8578 sec/batch
Epoch 4/20  Iteration 565/3560 Training loss: 1.8946 0.8633 sec/batch
Epoch 4/20  Iteration 566/3560 Training loss: 1.8934 0.8540 sec/batch
Epoch 4/20  Iteration 567/3560 Training loss: 1.8933 0.8512 sec/batch
Epoch 4/20  Iteration 568/3560 Training loss: 1.8939 0.8530 sec/batch
Epoch 4/20  Iteration 569/3560 Training loss: 1.8932 0.9405 sec/batch
Epoch 4/20  Iteration 570/3560 Training loss: 1.8925 0.8601 sec/batch
Epoch 4/20  Iteration 571/3560 Training loss: 1.8916 0.8680 sec/batch
Epoch 4/20  Iteration 572/3560 Training loss: 1.8901 0.8578 sec/batch
Epoch 4/20  Iteration 573/3560 Training loss: 1.8886 0.8566 sec/batch
Epoch 4/20  Iteration 574/3560 Training loss: 1.8876 0.8903 sec/batch
Epoch 4/20  Iteration 575/3560 Training loss: 1.8867 0.8490 sec/batch
Epoch 4/20  Iteration 576/3560 Training loss: 1.8867 0.8518 sec/batch
Epoch 4/20  Iteration 577/3560 Training loss: 1.8859 0.8487 sec/batch
Epoch 4/20  Iteration 578/3560 Training loss: 1.8850 0.8516 sec/batch
Epoch 4/20  Iteration 579/3560 Training loss: 1.8849 0.8482 sec/batch
Epoch 4/20  Iteration 580/3560 Training loss: 1.8834 0.9007 sec/batch
Epoch 4/20  Iteration 581/3560 Training loss: 1.8831 0.8560 sec/batch
Epoch 4/20  Iteration 582/3560 Training loss: 1.8822 0.8496 sec/batch
Epoch 4/20  Iteration 583/3560 Training loss: 1.8817 0.8514 sec/batch
Epoch 4/20  Iteration 584/3560 Training loss: 1.8820 0.8529 sec/batch
Epoch 4/20  Iteration 585/3560 Training loss: 1.8814 0.8550 sec/batch
Epoch 4/20  Iteration 586/3560 Training loss: 1.8818 0.8930 sec/batch
Epoch 4/20  Iteration 587/3560 Training loss: 1.8813 0.8533 sec/batch
Epoch 4/20  Iteration 588/3560 Training loss: 1.8809 0.8472 sec/batch
Epoch 4/20  Iteration 589/3560 Training loss: 1.8804 0.8528 sec/batch
Epoch 4/20  Iteration 590/3560 Training loss: 1.8804 0.8500 sec/batch
Epoch 4/20  Iteration 591/3560 Training loss: 1.8805 0.8956 sec/batch
Epoch 4/20  Iteration 592/3560 Training loss: 1.8801 0.8524 sec/batch
Epoch 4/20  Iteration 593/3560 Training loss: 1.8793 0.9589 sec/batch
Epoch 4/20  Iteration 594/3560 Training loss: 1.8796 0.8529 sec/batch
Epoch 4/20  Iteration 595/3560 Training loss: 1.8792 0.8485 sec/batch
Epoch 4/20  Iteration 596/3560 Training loss: 1.8797 0.8535 sec/batch
Epoch 4/20  Iteration 597/3560 Training loss: 1.8798 0.8517 sec/batch
Epoch 4/20  Iteration 598/3560 Training loss: 1.8797 0.8536 sec/batch
Epoch 4/20  Iteration 599/3560 Training loss: 1.8793 0.8533 sec/batch
Epoch 4/20  Iteration 600/3560 Training loss: 1.8794 0.8562 sec/batch
Epoch 4/20  Iteration 601/3560 Training loss: 1.8791 0.8483 sec/batch
Epoch 4/20  Iteration 602/3560 Training loss: 1.8785 0.8592 sec/batch
Epoch 4/20  Iteration 603/3560 Training loss: 1.8782 0.9013 sec/batch
Epoch 4/20  Iteration 604/3560 Training loss: 1.8779 0.8493 sec/batch
Epoch 4/20  Iteration 605/3560 Training loss: 1.8782 0.8721 sec/batch
Epoch 4/20  Iteration 606/3560 Training loss: 1.8781 0.8500 sec/batch
Epoch 4/20  Iteration 607/3560 Training loss: 1.8783 0.8409 sec/batch
Epoch 4/20  Iteration 608/3560 Training loss: 1.8777 0.9056 sec/batch
Epoch 4/20  Iteration 609/3560 Training loss: 1.8775 0.9114 sec/batch
Epoch 4/20  Iteration 610/3560 Training loss: 1.8775 0.8745 sec/batch
Epoch 4/20  Iteration 611/3560 Training loss: 1.8772 0.9200 sec/batch
Epoch 4/20  Iteration 612/3560 Training loss: 1.8771 0.8572 sec/batch
Epoch 4/20  Iteration 613/3560 Training loss: 1.8763 0.8597 sec/batch
Epoch 4/20  Iteration 614/3560 Training loss: 1.8760 0.8510 sec/batch
Epoch 4/20  Iteration 615/3560 Training loss: 1.8752 0.8671 sec/batch
Epoch 4/20  Iteration 616/3560 Training loss: 1.8751 0.8578 sec/batch
Epoch 4/20  Iteration 617/3560 Training loss: 1.8743 0.8494 sec/batch
Epoch 4/20  Iteration 618/3560 Training loss: 1.8739 0.8509 sec/batch
Epoch 4/20  Iteration 619/3560 Training loss: 1.8731 0.9078 sec/batch
Epoch 4/20  Iteration 620/3560 Training loss: 1.8724 0.8792 sec/batch
Epoch 4/20  Iteration 621/3560 Training loss: 1.8720 0.8592 sec/batch
Epoch 4/20  Iteration 622/3560 Training loss: 1.8713 0.8536 sec/batch
Epoch 4/20  Iteration 623/3560 Training loss: 1.8705 0.8526 sec/batch
Epoch 4/20  Iteration 624/3560 Training loss: 1.8703 0.8427 sec/batch
Epoch 4/20  Iteration 625/3560 Training loss: 1.8698 0.8553 sec/batch
Epoch 4/20  Iteration 626/3560 Training loss: 1.8693 0.8471 sec/batch
Epoch 4/20  Iteration 627/3560 Training loss: 1.8686 0.9224 sec/batch
Epoch 4/20  Iteration 628/3560 Training loss: 1.8679 0.8538 sec/batch
Epoch 4/20  Iteration 629/3560 Training loss: 1.8673 0.8446 sec/batch
Epoch 4/20  Iteration 630/3560 Training loss: 1.8669 0.8615 sec/batch
Epoch 4/20  Iteration 631/3560 Training loss: 1.8665 0.8615 sec/batch
Epoch 4/20  Iteration 632/3560 Training loss: 1.8658 0.8508 sec/batch
Epoch 4/20  Iteration 633/3560 Training loss: 1.8651 0.9023 sec/batch
Epoch 4/20  Iteration 634/3560 Training loss: 1.8643 0.8523 sec/batch
Epoch 4/20  Iteration 635/3560 Training loss: 1.8639 0.8539 sec/batch
Epoch 4/20  Iteration 636/3560 Training loss: 1.8635 0.8549 sec/batch
Epoch 4/20  Iteration 637/3560 Training loss: 1.8630 0.8511 sec/batch
Epoch 4/20  Iteration 638/3560 Training loss: 1.8625 0.8538 sec/batch
Epoch 4/20  Iteration 639/3560 Training loss: 1.8620 0.8654 sec/batch
Epoch 4/20  Iteration 640/3560 Training loss: 1.8617 0.8535 sec/batch
Epoch 4/20  Iteration 641/3560 Training loss: 1.8613 0.8700 sec/batch
Epoch 4/20  Iteration 642/3560 Training loss: 1.8609 0.8532 sec/batch
Epoch 4/20  Iteration 643/3560 Training loss: 1.8607 0.8486 sec/batch
Epoch 4/20  Iteration 644/3560 Training loss: 1.8604 0.8447 sec/batch
Epoch 4/20  Iteration 645/3560 Training loss: 1.8600 0.8538 sec/batch
Epoch 4/20  Iteration 646/3560 Training loss: 1.8597 0.8484 sec/batch
Epoch 4/20  Iteration 647/3560 Training loss: 1.8592 0.8556 sec/batch
Epoch 4/20  Iteration 648/3560 Training loss: 1.8587 0.8568 sec/batch
Epoch 4/20  Iteration 649/3560 Training loss: 1.8582 0.9128 sec/batch
Epoch 4/20  Iteration 650/3560 Training loss: 1.8576 0.9156 sec/batch
Epoch 4/20  Iteration 651/3560 Training loss: 1.8573 0.8770 sec/batch
Epoch 4/20  Iteration 652/3560 Training loss: 1.8570 0.8597 sec/batch
Epoch 4/20  Iteration 653/3560 Training loss: 1.8567 0.8842 sec/batch
Epoch 4/20  Iteration 654/3560 Training loss: 1.8563 0.8609 sec/batch
Epoch 4/20  Iteration 655/3560 Training loss: 1.8560 0.8902 sec/batch
Epoch 4/20  Iteration 656/3560 Training loss: 1.8554 0.9609 sec/batch
Epoch 4/20  Iteration 657/3560 Training loss: 1.8547 0.9355 sec/batch
Epoch 4/20  Iteration 658/3560 Training loss: 1.8546 0.9377 sec/batch
Epoch 4/20  Iteration 659/3560 Training loss: 1.8543 0.8621 sec/batch
Epoch 4/20  Iteration 660/3560 Training loss: 1.8536 0.8520 sec/batch
Epoch 4/20  Iteration 661/3560 Training loss: 1.8534 0.8490 sec/batch
Epoch 4/20  Iteration 662/3560 Training loss: 1.8532 0.9019 sec/batch
Epoch 4/20  Iteration 663/3560 Training loss: 1.8528 0.8516 sec/batch
Epoch 4/20  Iteration 664/3560 Training loss: 1.8524 0.8425 sec/batch
Epoch 4/20  Iteration 665/3560 Training loss: 1.8519 0.8537 sec/batch
Epoch 4/20  Iteration 666/3560 Training loss: 1.8514 0.8503 sec/batch
Epoch 4/20  Iteration 667/3560 Training loss: 1.8512 0.8504 sec/batch
Epoch 4/20  Iteration 668/3560 Training loss: 1.8509 0.8458 sec/batch
Epoch 4/20  Iteration 669/3560 Training loss: 1.8506 0.8669 sec/batch
Epoch 4/20  Iteration 670/3560 Training loss: 1.8505 0.8568 sec/batch
Epoch 4/20  Iteration 671/3560 Training loss: 1.8503 0.8528 sec/batch
Epoch 4/20  Iteration 672/3560 Training loss: 1.8501 0.8486 sec/batch
Epoch 4/20  Iteration 673/3560 Training loss: 1.8500 0.8455 sec/batch
Epoch 4/20  Iteration 674/3560 Training loss: 1.8496 0.8486 sec/batch
Epoch 4/20  Iteration 675/3560 Training loss: 1.8496 0.8781 sec/batch
Epoch 4/20  Iteration 676/3560 Training loss: 1.8493 0.8906 sec/batch
Epoch 4/20  Iteration 677/3560 Training loss: 1.8490 0.8675 sec/batch
Epoch 4/20  Iteration 678/3560 Training loss: 1.8487 0.9006 sec/batch
Epoch 4/20  Iteration 679/3560 Training loss: 1.8484 0.8807 sec/batch
Epoch 4/20  Iteration 680/3560 Training loss: 1.8483 0.8797 sec/batch
Epoch 4/20  Iteration 681/3560 Training loss: 1.8481 0.8691 sec/batch
Epoch 4/20  Iteration 682/3560 Training loss: 1.8480 0.8624 sec/batch
Epoch 4/20  Iteration 683/3560 Training loss: 1.8477 0.8481 sec/batch
Epoch 4/20  Iteration 684/3560 Training loss: 1.8473 0.8501 sec/batch
Epoch 4/20  Iteration 685/3560 Training loss: 1.8469 0.8679 sec/batch
Epoch 4/20  Iteration 686/3560 Training loss: 1.8468 0.8702 sec/batch
Epoch 4/20  Iteration 687/3560 Training loss: 1.8466 0.8776 sec/batch
Epoch 4/20  Iteration 688/3560 Training loss: 1.8465 0.8656 sec/batch
Epoch 4/20  Iteration 689/3560 Training loss: 1.8462 0.8850 sec/batch
Epoch 4/20  Iteration 690/3560 Training loss: 1.8460 1.1230 sec/batch
Epoch 4/20  Iteration 691/3560 Training loss: 1.8458 1.0332 sec/batch
Epoch 4/20  Iteration 692/3560 Training loss: 1.8455 0.9941 sec/batch
Epoch 4/20  Iteration 693/3560 Training loss: 1.8451 0.8533 sec/batch
Epoch 4/20  Iteration 694/3560 Training loss: 1.8450 0.8504 sec/batch
Epoch 4/20  Iteration 695/3560 Training loss: 1.8449 0.8650 sec/batch
Epoch 4/20  Iteration 696/3560 Training loss: 1.8446 0.8489 sec/batch
Epoch 4/20  Iteration 697/3560 Training loss: 1.8444 0.8489 sec/batch
Epoch 4/20  Iteration 698/3560 Training loss: 1.8442 0.8416 sec/batch
Epoch 4/20  Iteration 699/3560 Training loss: 1.8440 0.8367 sec/batch
Epoch 4/20  Iteration 700/3560 Training loss: 1.8437 0.8488 sec/batch
Epoch 4/20  Iteration 701/3560 Training loss: 1.8435 0.8406 sec/batch
Epoch 4/20  Iteration 702/3560 Training loss: 1.8437 0.8443 sec/batch
Epoch 4/20  Iteration 703/3560 Training loss: 1.8434 0.8493 sec/batch
Epoch 4/20  Iteration 704/3560 Training loss: 1.8431 0.8508 sec/batch
Epoch 4/20  Iteration 705/3560 Training loss: 1.8427 0.8399 sec/batch
Epoch 4/20  Iteration 706/3560 Training loss: 1.8424 0.9116 sec/batch
Epoch 4/20  Iteration 707/3560 Training loss: 1.8422 0.8477 sec/batch
Epoch 4/20  Iteration 708/3560 Training loss: 1.8420 0.8636 sec/batch
Epoch 4/20  Iteration 709/3560 Training loss: 1.8417 0.8530 sec/batch
Epoch 4/20  Iteration 710/3560 Training loss: 1.8415 0.8684 sec/batch
Epoch 4/20  Iteration 711/3560 Training loss: 1.8411 0.8607 sec/batch
Epoch 4/20  Iteration 712/3560 Training loss: 1.8409 0.8647 sec/batch
Epoch 5/20  Iteration 713/3560 Training loss: 1.8558 0.8503 sec/batch
Epoch 5/20  Iteration 714/3560 Training loss: 1.8215 0.8413 sec/batch
Epoch 5/20  Iteration 715/3560 Training loss: 1.8093 0.8410 sec/batch
Epoch 5/20  Iteration 716/3560 Training loss: 1.8021 0.8417 sec/batch
Epoch 5/20  Iteration 717/3560 Training loss: 1.7979 0.8497 sec/batch
Epoch 5/20  Iteration 718/3560 Training loss: 1.7895 0.8393 sec/batch
Epoch 5/20  Iteration 719/3560 Training loss: 1.7890 0.8634 sec/batch
Epoch 5/20  Iteration 720/3560 Training loss: 1.7866 0.8693 sec/batch
Epoch 5/20  Iteration 721/3560 Training loss: 1.7892 0.8458 sec/batch
Epoch 5/20  Iteration 722/3560 Training loss: 1.7891 0.8640 sec/batch
Epoch 5/20  Iteration 723/3560 Training loss: 1.7867 0.8558 sec/batch
Epoch 5/20  Iteration 724/3560 Training loss: 1.7846 0.8385 sec/batch
Epoch 5/20  Iteration 725/3560 Training loss: 1.7848 0.9244 sec/batch
Epoch 5/20  Iteration 726/3560 Training loss: 1.7863 0.8741 sec/batch
Epoch 5/20  Iteration 727/3560 Training loss: 1.7855 0.8911 sec/batch
Epoch 5/20  Iteration 728/3560 Training loss: 1.7836 0.8680 sec/batch
Epoch 5/20  Iteration 729/3560 Training loss: 1.7834 0.8411 sec/batch
Epoch 5/20  Iteration 730/3560 Training loss: 1.7856 0.9171 sec/batch
Epoch 5/20  Iteration 731/3560 Training loss: 1.7861 0.8469 sec/batch
Epoch 5/20  Iteration 732/3560 Training loss: 1.7868 0.8987 sec/batch
Epoch 5/20  Iteration 733/3560 Training loss: 1.7858 0.8977 sec/batch
Epoch 5/20  Iteration 734/3560 Training loss: 1.7861 0.8514 sec/batch
Epoch 5/20  Iteration 735/3560 Training loss: 1.7857 0.8685 sec/batch
Epoch 5/20  Iteration 736/3560 Training loss: 1.7851 0.8510 sec/batch
Epoch 5/20  Iteration 737/3560 Training loss: 1.7844 0.8748 sec/batch
Epoch 5/20  Iteration 738/3560 Training loss: 1.7832 0.8422 sec/batch
Epoch 5/20  Iteration 739/3560 Training loss: 1.7823 0.8706 sec/batch
Epoch 5/20  Iteration 740/3560 Training loss: 1.7826 0.8584 sec/batch
Epoch 5/20  Iteration 741/3560 Training loss: 1.7833 0.8944 sec/batch
Epoch 5/20  Iteration 742/3560 Training loss: 1.7833 0.8538 sec/batch
Epoch 5/20  Iteration 743/3560 Training loss: 1.7829 0.8917 sec/batch
Epoch 5/20  Iteration 744/3560 Training loss: 1.7819 0.8598 sec/batch
Epoch 5/20  Iteration 745/3560 Training loss: 1.7820 0.9071 sec/batch
Epoch 5/20  Iteration 746/3560 Training loss: 1.7825 0.9008 sec/batch
Epoch 5/20  Iteration 747/3560 Training loss: 1.7820 0.8897 sec/batch
Epoch 5/20  Iteration 748/3560 Training loss: 1.7814 0.8951 sec/batch
Epoch 5/20  Iteration 749/3560 Training loss: 1.7806 0.8570 sec/batch
...
Epoch 5/20  Iteration 890/3560 Training loss: 1.7441 0.8856 sec/batch
Epoch 6/20  Iteration 891/3560 Training loss: 1.7873 0.8662 sec/batch
...
Epoch 6/20  Iteration 1068/3560 Training loss: 1.6740 0.8496 sec/batch
Epoch 7/20  Iteration 1069/3560 Training loss: 1.7288 0.8489 sec/batch
...
Epoch 7/20  Iteration 1246/3560 Training loss: 1.6183 0.9019 sec/batch
Epoch 8/20  Iteration 1247/3560 Training loss: 1.6765 0.8503 sec/batch
...
Epoch 8/20  Iteration 1424/3560 Training loss: 1.5733 0.9036 sec/batch
Epoch 9/20  Iteration 1425/3560 Training loss: 1.6428 0.9141 sec/batch
...
Epoch 9/20  Iteration 1602/3560 Training loss: 1.5380 0.9224 sec/batch
Epoch 10/20  Iteration 1603/3560 Training loss: 1.6095 0.8681 sec/batch
...
Epoch 10/20  Iteration 1654/3560 Training loss: 1.5168 0.8519 sec/batch
Epoch 10/20  Iteration 1655/3560 Training loss: 1.5168 0.8979 sec/batch
Epoch 10/20  Iteration 1656/3560 Training loss: 1.5170 0.8482 sec/batch
Epoch 10/20  Iteration 1657/3560 Training loss: 1.5167 0.8501 sec/batch
Epoch 10/20  Iteration 1658/3560 Training loss: 1.5166 0.8588 sec/batch
Epoch 10/20  Iteration 1659/3560 Training loss: 1.5169 0.8524 sec/batch
Epoch 10/20  Iteration 1660/3560 Training loss: 1.5164 0.8422 sec/batch
Epoch 10/20  Iteration 1661/3560 Training loss: 1.5157 0.8495 sec/batch
Epoch 10/20  Iteration 1662/3560 Training loss: 1.5163 0.8583 sec/batch
Epoch 10/20  Iteration 1663/3560 Training loss: 1.5163 0.8462 sec/batch
Epoch 10/20  Iteration 1664/3560 Training loss: 1.5173 0.8477 sec/batch
Epoch 10/20  Iteration 1665/3560 Training loss: 1.5176 0.9525 sec/batch
Epoch 10/20  Iteration 1666/3560 Training loss: 1.5177 0.8428 sec/batch
Epoch 10/20  Iteration 1667/3560 Training loss: 1.5177 0.8817 sec/batch
Epoch 10/20  Iteration 1668/3560 Training loss: 1.5179 0.8759 sec/batch
Epoch 10/20  Iteration 1669/3560 Training loss: 1.5181 0.8506 sec/batch
Epoch 10/20  Iteration 1670/3560 Training loss: 1.5178 0.8465 sec/batch
Epoch 10/20  Iteration 1671/3560 Training loss: 1.5177 0.8476 sec/batch
Epoch 10/20  Iteration 1672/3560 Training loss: 1.5176 0.8445 sec/batch
Epoch 10/20  Iteration 1673/3560 Training loss: 1.5181 0.8545 sec/batch
Epoch 10/20  Iteration 1674/3560 Training loss: 1.5184 0.8542 sec/batch
Epoch 10/20  Iteration 1675/3560 Training loss: 1.5187 0.8457 sec/batch
Epoch 10/20  Iteration 1676/3560 Training loss: 1.5184 0.8443 sec/batch
Epoch 10/20  Iteration 1677/3560 Training loss: 1.5182 0.8472 sec/batch
Epoch 10/20  Iteration 1678/3560 Training loss: 1.5184 0.8700 sec/batch
Epoch 10/20  Iteration 1679/3560 Training loss: 1.5182 0.8759 sec/batch
Epoch 10/20  Iteration 1680/3560 Training loss: 1.5181 0.8753 sec/batch
Epoch 10/20  Iteration 1681/3560 Training loss: 1.5175 0.8651 sec/batch
Epoch 10/20  Iteration 1682/3560 Training loss: 1.5174 1.0152 sec/batch
Epoch 10/20  Iteration 1683/3560 Training loss: 1.5168 0.8530 sec/batch
Epoch 10/20  Iteration 1684/3560 Training loss: 1.5170 0.9011 sec/batch
Epoch 10/20  Iteration 1685/3560 Training loss: 1.5164 0.8939 sec/batch
Epoch 10/20  Iteration 1686/3560 Training loss: 1.5163 0.8440 sec/batch
Epoch 10/20  Iteration 1687/3560 Training loss: 1.5160 0.8763 sec/batch
Epoch 10/20  Iteration 1688/3560 Training loss: 1.5157 0.8626 sec/batch
Epoch 10/20  Iteration 1689/3560 Training loss: 1.5155 0.9053 sec/batch
Epoch 10/20  Iteration 1690/3560 Training loss: 1.5151 0.9121 sec/batch
Epoch 10/20  Iteration 1691/3560 Training loss: 1.5146 0.8981 sec/batch
Epoch 10/20  Iteration 1692/3560 Training loss: 1.5146 0.8377 sec/batch
Epoch 10/20  Iteration 1693/3560 Training loss: 1.5144 0.8898 sec/batch
Epoch 10/20  Iteration 1694/3560 Training loss: 1.5142 0.8641 sec/batch
Epoch 10/20  Iteration 1695/3560 Training loss: 1.5139 0.8592 sec/batch
Epoch 10/20  Iteration 1696/3560 Training loss: 1.5135 0.9204 sec/batch
Epoch 10/20  Iteration 1697/3560 Training loss: 1.5133 0.8508 sec/batch
Epoch 10/20  Iteration 1698/3560 Training loss: 1.5134 0.8980 sec/batch
Epoch 10/20  Iteration 1699/3560 Training loss: 1.5132 0.8490 sec/batch
Epoch 10/20  Iteration 1700/3560 Training loss: 1.5128 0.8552 sec/batch
Epoch 10/20  Iteration 1701/3560 Training loss: 1.5125 0.8573 sec/batch
Epoch 10/20  Iteration 1702/3560 Training loss: 1.5120 0.8496 sec/batch
Epoch 10/20  Iteration 1703/3560 Training loss: 1.5120 0.8458 sec/batch
Epoch 10/20  Iteration 1704/3560 Training loss: 1.5119 0.8484 sec/batch
Epoch 10/20  Iteration 1705/3560 Training loss: 1.5118 0.8485 sec/batch
Epoch 10/20  Iteration 1706/3560 Training loss: 1.5115 0.8516 sec/batch
Epoch 10/20  Iteration 1707/3560 Training loss: 1.5114 0.8623 sec/batch
Epoch 10/20  Iteration 1708/3560 Training loss: 1.5112 0.8481 sec/batch
Epoch 10/20  Iteration 1709/3560 Training loss: 1.5112 0.8514 sec/batch
Epoch 10/20  Iteration 1710/3560 Training loss: 1.5112 0.8551 sec/batch
Epoch 10/20  Iteration 1711/3560 Training loss: 1.5111 0.8507 sec/batch
Epoch 10/20  Iteration 1712/3560 Training loss: 1.5112 0.8556 sec/batch
Epoch 10/20  Iteration 1713/3560 Training loss: 1.5110 0.8501 sec/batch
Epoch 10/20  Iteration 1714/3560 Training loss: 1.5109 0.8513 sec/batch
Epoch 10/20  Iteration 1715/3560 Training loss: 1.5107 0.8482 sec/batch
Epoch 10/20  Iteration 1716/3560 Training loss: 1.5105 0.8431 sec/batch
Epoch 10/20  Iteration 1717/3560 Training loss: 1.5101 0.8493 sec/batch
Epoch 10/20  Iteration 1718/3560 Training loss: 1.5097 0.8722 sec/batch
Epoch 10/20  Iteration 1719/3560 Training loss: 1.5096 0.8463 sec/batch
Epoch 10/20  Iteration 1720/3560 Training loss: 1.5095 0.8531 sec/batch
Epoch 10/20  Iteration 1721/3560 Training loss: 1.5095 0.8566 sec/batch
Epoch 10/20  Iteration 1722/3560 Training loss: 1.5093 0.8591 sec/batch
Epoch 10/20  Iteration 1723/3560 Training loss: 1.5092 0.8460 sec/batch
Epoch 10/20  Iteration 1724/3560 Training loss: 1.5089 0.8446 sec/batch
Epoch 10/20  Iteration 1725/3560 Training loss: 1.5085 0.8780 sec/batch
Epoch 10/20  Iteration 1726/3560 Training loss: 1.5085 0.9137 sec/batch
Epoch 10/20  Iteration 1727/3560 Training loss: 1.5085 0.9173 sec/batch
Epoch 10/20  Iteration 1728/3560 Training loss: 1.5080 0.8670 sec/batch
Epoch 10/20  Iteration 1729/3560 Training loss: 1.5081 0.8336 sec/batch
Epoch 10/20  Iteration 1730/3560 Training loss: 1.5081 0.8430 sec/batch
Epoch 10/20  Iteration 1731/3560 Training loss: 1.5080 0.8389 sec/batch
Epoch 10/20  Iteration 1732/3560 Training loss: 1.5077 0.8915 sec/batch
Epoch 10/20  Iteration 1733/3560 Training loss: 1.5074 0.8377 sec/batch
Epoch 10/20  Iteration 1734/3560 Training loss: 1.5071 0.8342 sec/batch
Epoch 10/20  Iteration 1735/3560 Training loss: 1.5072 0.8443 sec/batch
Epoch 10/20  Iteration 1736/3560 Training loss: 1.5073 0.8480 sec/batch
Epoch 10/20  Iteration 1737/3560 Training loss: 1.5073 0.8486 sec/batch
Epoch 10/20  Iteration 1738/3560 Training loss: 1.5073 0.8739 sec/batch
Epoch 10/20  Iteration 1739/3560 Training loss: 1.5074 0.9192 sec/batch
Epoch 10/20  Iteration 1740/3560 Training loss: 1.5075 0.8458 sec/batch
Epoch 10/20  Iteration 1741/3560 Training loss: 1.5075 0.8490 sec/batch
Epoch 10/20  Iteration 1742/3560 Training loss: 1.5074 0.8992 sec/batch
Epoch 10/20  Iteration 1743/3560 Training loss: 1.5078 0.8537 sec/batch
Epoch 10/20  Iteration 1744/3560 Training loss: 1.5078 0.8684 sec/batch
Epoch 10/20  Iteration 1745/3560 Training loss: 1.5077 0.8525 sec/batch
Epoch 10/20  Iteration 1746/3560 Training loss: 1.5078 0.8591 sec/batch
Epoch 10/20  Iteration 1747/3560 Training loss: 1.5077 0.8485 sec/batch
Epoch 10/20  Iteration 1748/3560 Training loss: 1.5078 0.8538 sec/batch
Epoch 10/20  Iteration 1749/3560 Training loss: 1.5078 0.8733 sec/batch
Epoch 10/20  Iteration 1750/3560 Training loss: 1.5081 0.8780 sec/batch
Epoch 10/20  Iteration 1751/3560 Training loss: 1.5082 0.8498 sec/batch
Epoch 10/20  Iteration 1752/3560 Training loss: 1.5080 0.8691 sec/batch
Epoch 10/20  Iteration 1753/3560 Training loss: 1.5077 0.8780 sec/batch
Epoch 10/20  Iteration 1754/3560 Training loss: 1.5076 0.8467 sec/batch
Epoch 10/20  Iteration 1755/3560 Training loss: 1.5076 0.8795 sec/batch
Epoch 10/20  Iteration 1756/3560 Training loss: 1.5077 0.8619 sec/batch
Epoch 10/20  Iteration 1757/3560 Training loss: 1.5076 0.8968 sec/batch
Epoch 10/20  Iteration 1758/3560 Training loss: 1.5077 0.9225 sec/batch
Epoch 10/20  Iteration 1759/3560 Training loss: 1.5077 1.0183 sec/batch
Epoch 10/20  Iteration 1760/3560 Training loss: 1.5077 0.8608 sec/batch
Epoch 10/20  Iteration 1761/3560 Training loss: 1.5075 0.8519 sec/batch
Epoch 10/20  Iteration 1762/3560 Training loss: 1.5077 0.8518 sec/batch
Epoch 10/20  Iteration 1763/3560 Training loss: 1.5078 0.9014 sec/batch
Epoch 10/20  Iteration 1764/3560 Training loss: 1.5078 0.8582 sec/batch
Epoch 10/20  Iteration 1765/3560 Training loss: 1.5078 0.8582 sec/batch
Epoch 10/20  Iteration 1766/3560 Training loss: 1.5078 0.9123 sec/batch
Epoch 10/20  Iteration 1767/3560 Training loss: 1.5078 0.9069 sec/batch
Epoch 10/20  Iteration 1768/3560 Training loss: 1.5077 0.8631 sec/batch
Epoch 10/20  Iteration 1769/3560 Training loss: 1.5078 0.8936 sec/batch
Epoch 10/20  Iteration 1770/3560 Training loss: 1.5082 0.8729 sec/batch
Epoch 10/20  Iteration 1771/3560 Training loss: 1.5082 0.8870 sec/batch
Epoch 10/20  Iteration 1772/3560 Training loss: 1.5082 0.9055 sec/batch
Epoch 10/20  Iteration 1773/3560 Training loss: 1.5080 0.8594 sec/batch
Epoch 10/20  Iteration 1774/3560 Training loss: 1.5078 0.8522 sec/batch
Epoch 10/20  Iteration 1775/3560 Training loss: 1.5079 0.8631 sec/batch
Epoch 10/20  Iteration 1776/3560 Training loss: 1.5079 0.8537 sec/batch
Epoch 10/20  Iteration 1777/3560 Training loss: 1.5080 0.8928 sec/batch
Epoch 10/20  Iteration 1778/3560 Training loss: 1.5078 0.8628 sec/batch
Epoch 10/20  Iteration 1779/3560 Training loss: 1.5077 0.8518 sec/batch
Epoch 10/20  Iteration 1780/3560 Training loss: 1.5078 0.8478 sec/batch
Epoch 11/20  Iteration 1781/3560 Training loss: 1.5836 0.8687 sec/batch
Epoch 11/20  Iteration 1782/3560 Training loss: 1.5449 0.8768 sec/batch
Epoch 11/20  Iteration 1783/3560 Training loss: 1.5306 0.9279 sec/batch
Epoch 11/20  Iteration 1784/3560 Training loss: 1.5248 0.8591 sec/batch
Epoch 11/20  Iteration 1785/3560 Training loss: 1.5144 0.9066 sec/batch
Epoch 11/20  Iteration 1786/3560 Training loss: 1.5063 0.8547 sec/batch
Epoch 11/20  Iteration 1787/3560 Training loss: 1.5072 0.8769 sec/batch
Epoch 11/20  Iteration 1788/3560 Training loss: 1.5049 0.8478 sec/batch
Epoch 11/20  Iteration 1789/3560 Training loss: 1.5048 0.8589 sec/batch
Epoch 11/20  Iteration 1790/3560 Training loss: 1.5044 0.8489 sec/batch
Epoch 11/20  Iteration 1791/3560 Training loss: 1.5009 0.8611 sec/batch
Epoch 11/20  Iteration 1792/3560 Training loss: 1.5003 0.8936 sec/batch
Epoch 11/20  Iteration 1793/3560 Training loss: 1.4987 0.8443 sec/batch
Epoch 11/20  Iteration 1794/3560 Training loss: 1.5003 0.8543 sec/batch
Epoch 11/20  Iteration 1795/3560 Training loss: 1.4992 0.8909 sec/batch
Epoch 11/20  Iteration 1796/3560 Training loss: 1.4971 0.8633 sec/batch
Epoch 11/20  Iteration 1797/3560 Training loss: 1.4974 0.8838 sec/batch
Epoch 11/20  Iteration 1798/3560 Training loss: 1.4990 0.8678 sec/batch
Epoch 11/20  Iteration 1799/3560 Training loss: 1.4991 0.9002 sec/batch
Epoch 11/20  Iteration 1800/3560 Training loss: 1.5005 0.8833 sec/batch
Epoch 11/20  Iteration 1801/3560 Training loss: 1.4996 0.9237 sec/batch
Epoch 11/20  Iteration 1802/3560 Training loss: 1.4997 0.9608 sec/batch
Epoch 11/20  Iteration 1803/3560 Training loss: 1.4986 0.8823 sec/batch
Epoch 11/20  Iteration 1804/3560 Training loss: 1.4984 0.8762 sec/batch
Epoch 11/20  Iteration 1805/3560 Training loss: 1.4987 0.9305 sec/batch
Epoch 11/20  Iteration 1806/3560 Training loss: 1.4969 0.8635 sec/batch
Epoch 11/20  Iteration 1807/3560 Training loss: 1.4958 0.8436 sec/batch
Epoch 11/20  Iteration 1808/3560 Training loss: 1.4963 0.8854 sec/batch
Epoch 11/20  Iteration 1809/3560 Training loss: 1.4966 0.8751 sec/batch
Epoch 11/20  Iteration 1810/3560 Training loss: 1.4971 0.9106 sec/batch
Epoch 11/20  Iteration 1811/3560 Training loss: 1.4963 0.8818 sec/batch
Epoch 11/20  Iteration 1812/3560 Training loss: 1.4951 0.8821 sec/batch
Epoch 11/20  Iteration 1813/3560 Training loss: 1.4952 0.8694 sec/batch
Epoch 11/20  Iteration 1814/3560 Training loss: 1.4955 0.8532 sec/batch
Epoch 11/20  Iteration 1815/3560 Training loss: 1.4952 0.8920 sec/batch
Epoch 11/20  Iteration 1816/3560 Training loss: 1.4949 0.8506 sec/batch
Epoch 11/20  Iteration 1817/3560 Training loss: 1.4940 0.8678 sec/batch
Epoch 11/20  Iteration 1818/3560 Training loss: 1.4928 0.8736 sec/batch
Epoch 11/20  Iteration 1819/3560 Training loss: 1.4913 0.8605 sec/batch
Epoch 11/20  Iteration 1820/3560 Training loss: 1.4909 0.8994 sec/batch
Epoch 11/20  Iteration 1821/3560 Training loss: 1.4904 0.9077 sec/batch
Epoch 11/20  Iteration 1822/3560 Training loss: 1.4912 0.8763 sec/batch
Epoch 11/20  Iteration 1823/3560 Training loss: 1.4909 0.9099 sec/batch
Epoch 11/20  Iteration 1824/3560 Training loss: 1.4901 0.9302 sec/batch
Epoch 11/20  Iteration 1825/3560 Training loss: 1.4905 0.8533 sec/batch
Epoch 11/20  Iteration 1826/3560 Training loss: 1.4898 0.8665 sec/batch
Epoch 11/20  Iteration 1827/3560 Training loss: 1.4894 0.8522 sec/batch
Epoch 11/20  Iteration 1828/3560 Training loss: 1.4890 0.8871 sec/batch
Epoch 11/20  Iteration 1829/3560 Training loss: 1.4888 1.0432 sec/batch
Epoch 11/20  Iteration 1830/3560 Training loss: 1.4892 0.8692 sec/batch
Epoch 11/20  Iteration 1831/3560 Training loss: 1.4885 0.9123 sec/batch
Epoch 11/20  Iteration 1832/3560 Training loss: 1.4893 0.9318 sec/batch
Epoch 11/20  Iteration 1833/3560 Training loss: 1.4893 0.8657 sec/batch
Epoch 11/20  Iteration 1834/3560 Training loss: 1.4894 0.8517 sec/batch
Epoch 11/20  Iteration 1835/3560 Training loss: 1.4892 0.8445 sec/batch
Epoch 11/20  Iteration 1836/3560 Training loss: 1.4893 0.8540 sec/batch
Epoch 11/20  Iteration 1837/3560 Training loss: 1.4898 0.8470 sec/batch
Epoch 11/20  Iteration 1838/3560 Training loss: 1.4892 0.8494 sec/batch
Epoch 11/20  Iteration 1839/3560 Training loss: 1.4886 0.8484 sec/batch
Epoch 11/20  Iteration 1840/3560 Training loss: 1.4890 0.8454 sec/batch
Epoch 11/20  Iteration 1841/3560 Training loss: 1.4890 0.8539 sec/batch
Epoch 11/20  Iteration 1842/3560 Training loss: 1.4900 0.8419 sec/batch
Epoch 11/20  Iteration 1843/3560 Training loss: 1.4903 0.8546 sec/batch
Epoch 11/20  Iteration 1844/3560 Training loss: 1.4905 0.8507 sec/batch
Epoch 11/20  Iteration 1845/3560 Training loss: 1.4903 0.8442 sec/batch
Epoch 11/20  Iteration 1846/3560 Training loss: 1.4904 0.8594 sec/batch
Epoch 11/20  Iteration 1847/3560 Training loss: 1.4909 0.8459 sec/batch
Epoch 11/20  Iteration 1848/3560 Training loss: 1.4905 0.8484 sec/batch
Epoch 11/20  Iteration 1849/3560 Training loss: 1.4906 0.8468 sec/batch
Epoch 11/20  Iteration 1850/3560 Training loss: 1.4906 0.8489 sec/batch
Epoch 11/20  Iteration 1851/3560 Training loss: 1.4912 0.8537 sec/batch
Epoch 11/20  Iteration 1852/3560 Training loss: 1.4913 0.8511 sec/batch
Epoch 11/20  Iteration 1853/3560 Training loss: 1.4918 0.8956 sec/batch
Epoch 11/20  Iteration 1854/3560 Training loss: 1.4915 0.8506 sec/batch
Epoch 11/20  Iteration 1855/3560 Training loss: 1.4912 0.8748 sec/batch
Epoch 11/20  Iteration 1856/3560 Training loss: 1.4915 0.8501 sec/batch
Epoch 11/20  Iteration 1857/3560 Training loss: 1.4913 0.8433 sec/batch
Epoch 11/20  Iteration 1858/3560 Training loss: 1.4913 0.8453 sec/batch
Epoch 11/20  Iteration 1859/3560 Training loss: 1.4907 0.8499 sec/batch
Epoch 11/20  Iteration 1860/3560 Training loss: 1.4907 0.8559 sec/batch
Epoch 11/20  Iteration 1861/3560 Training loss: 1.4903 0.8565 sec/batch
Epoch 11/20  Iteration 1862/3560 Training loss: 1.4903 0.8528 sec/batch
Epoch 11/20  Iteration 1863/3560 Training loss: 1.4897 0.8519 sec/batch
Epoch 11/20  Iteration 1864/3560 Training loss: 1.4897 0.8644 sec/batch
Epoch 11/20  Iteration 1865/3560 Training loss: 1.4893 0.8459 sec/batch
Epoch 11/20  Iteration 1866/3560 Training loss: 1.4891 0.9099 sec/batch
Epoch 11/20  Iteration 1867/3560 Training loss: 1.4888 0.8678 sec/batch
Epoch 11/20  Iteration 1868/3560 Training loss: 1.4885 0.8592 sec/batch
Epoch 11/20  Iteration 1869/3560 Training loss: 1.4881 0.8484 sec/batch
Epoch 11/20  Iteration 1870/3560 Training loss: 1.4883 0.8473 sec/batch
Epoch 11/20  Iteration 1871/3560 Training loss: 1.4880 0.8565 sec/batch
Epoch 11/20  Iteration 1872/3560 Training loss: 1.4877 0.8471 sec/batch
Epoch 11/20  Iteration 1873/3560 Training loss: 1.4872 0.8568 sec/batch
Epoch 11/20  Iteration 1874/3560 Training loss: 1.4869 0.8578 sec/batch
Epoch 11/20  Iteration 1875/3560 Training loss: 1.4866 0.8581 sec/batch
Epoch 11/20  Iteration 1876/3560 Training loss: 1.4867 0.8537 sec/batch
Epoch 11/20  Iteration 1877/3560 Training loss: 1.4867 0.8466 sec/batch
Epoch 11/20  Iteration 1878/3560 Training loss: 1.4864 0.8498 sec/batch
Epoch 11/20  Iteration 1879/3560 Training loss: 1.4860 0.8968 sec/batch
Epoch 11/20  Iteration 1880/3560 Training loss: 1.4856 0.8477 sec/batch
Epoch 11/20  Iteration 1881/3560 Training loss: 1.4856 0.8416 sec/batch
Epoch 11/20  Iteration 1882/3560 Training loss: 1.4854 0.8591 sec/batch
Epoch 11/20  Iteration 1883/3560 Training loss: 1.4852 0.8509 sec/batch
Epoch 11/20  Iteration 1884/3560 Training loss: 1.4852 0.8441 sec/batch
Epoch 11/20  Iteration 1885/3560 Training loss: 1.4850 0.8931 sec/batch
Epoch 11/20  Iteration 1886/3560 Training loss: 1.4848 0.8534 sec/batch
Epoch 11/20  Iteration 1887/3560 Training loss: 1.4847 0.8571 sec/batch
Epoch 11/20  Iteration 1888/3560 Training loss: 1.4847 0.8486 sec/batch
Epoch 11/20  Iteration 1889/3560 Training loss: 1.4846 0.8535 sec/batch
Epoch 11/20  Iteration 1890/3560 Training loss: 1.4847 0.8873 sec/batch
Epoch 11/20  Iteration 1891/3560 Training loss: 1.4844 0.8506 sec/batch
Epoch 11/20  Iteration 1892/3560 Training loss: 1.4843 0.8524 sec/batch
Epoch 11/20  Iteration 1893/3560 Training loss: 1.4840 0.8433 sec/batch
Epoch 11/20  Iteration 1894/3560 Training loss: 1.4838 0.8479 sec/batch
Epoch 11/20  Iteration 1895/3560 Training loss: 1.4836 0.8529 sec/batch
Epoch 11/20  Iteration 1896/3560 Training loss: 1.4831 0.8496 sec/batch
Epoch 11/20  Iteration 1897/3560 Training loss: 1.4830 0.8464 sec/batch
Epoch 11/20  Iteration 1898/3560 Training loss: 1.4829 0.8446 sec/batch
Epoch 11/20  Iteration 1899/3560 Training loss: 1.4828 0.8496 sec/batch
Epoch 11/20  Iteration 1900/3560 Training loss: 1.4827 0.8847 sec/batch
Epoch 11/20  Iteration 1901/3560 Training loss: 1.4826 0.8688 sec/batch
Epoch 11/20  Iteration 1902/3560 Training loss: 1.4823 0.8601 sec/batch
Epoch 11/20  Iteration 1903/3560 Training loss: 1.4819 0.8700 sec/batch
Epoch 11/20  Iteration 1904/3560 Training loss: 1.4818 0.8956 sec/batch
Epoch 11/20  Iteration 1905/3560 Training loss: 1.4818 0.8546 sec/batch
Epoch 11/20  Iteration 1906/3560 Training loss: 1.4814 0.8517 sec/batch
Epoch 11/20  Iteration 1907/3560 Training loss: 1.4815 0.8644 sec/batch
Epoch 11/20  Iteration 1908/3560 Training loss: 1.4815 0.8449 sec/batch
Epoch 11/20  Iteration 1909/3560 Training loss: 1.4813 0.8494 sec/batch
Epoch 11/20  Iteration 1910/3560 Training loss: 1.4810 0.8522 sec/batch
Epoch 11/20  Iteration 1911/3560 Training loss: 1.4807 0.8498 sec/batch
Epoch 11/20  Iteration 1912/3560 Training loss: 1.4805 0.8527 sec/batch
Epoch 11/20  Iteration 1913/3560 Training loss: 1.4805 0.8430 sec/batch
Epoch 11/20  Iteration 1914/3560 Training loss: 1.4805 0.8471 sec/batch
Epoch 11/20  Iteration 1915/3560 Training loss: 1.4805 0.8540 sec/batch
Epoch 11/20  Iteration 1916/3560 Training loss: 1.4805 0.8945 sec/batch
Epoch 11/20  Iteration 1917/3560 Training loss: 1.4807 0.8497 sec/batch
Epoch 11/20  Iteration 1918/3560 Training loss: 1.4807 0.8508 sec/batch
Epoch 11/20  Iteration 1919/3560 Training loss: 1.4807 0.8535 sec/batch
Epoch 11/20  Iteration 1920/3560 Training loss: 1.4807 0.8476 sec/batch
Epoch 11/20  Iteration 1921/3560 Training loss: 1.4810 0.8531 sec/batch
Epoch 11/20  Iteration 1922/3560 Training loss: 1.4810 0.8478 sec/batch
Epoch 11/20  Iteration 1923/3560 Training loss: 1.4809 0.8481 sec/batch
Epoch 11/20  Iteration 1924/3560 Training loss: 1.4811 0.8606 sec/batch
Epoch 11/20  Iteration 1925/3560 Training loss: 1.4809 0.8497 sec/batch
Epoch 11/20  Iteration 1926/3560 Training loss: 1.4811 0.8495 sec/batch
Epoch 11/20  Iteration 1927/3560 Training loss: 1.4811 0.8483 sec/batch
Epoch 11/20  Iteration 1928/3560 Training loss: 1.4813 0.8436 sec/batch
Epoch 11/20  Iteration 1929/3560 Training loss: 1.4814 0.8549 sec/batch
Epoch 11/20  Iteration 1930/3560 Training loss: 1.4812 0.8431 sec/batch
Epoch 11/20  Iteration 1931/3560 Training loss: 1.4809 0.8549 sec/batch
Epoch 11/20  Iteration 1932/3560 Training loss: 1.4809 0.8524 sec/batch
Epoch 11/20  Iteration 1933/3560 Training loss: 1.4809 0.8496 sec/batch
Epoch 11/20  Iteration 1934/3560 Training loss: 1.4809 0.8590 sec/batch
Epoch 11/20  Iteration 1935/3560 Training loss: 1.4808 0.8406 sec/batch
Epoch 11/20  Iteration 1936/3560 Training loss: 1.4809 0.8525 sec/batch
Epoch 11/20  Iteration 1937/3560 Training loss: 1.4809 0.8872 sec/batch
Epoch 11/20  Iteration 1938/3560 Training loss: 1.4809 0.8443 sec/batch
Epoch 11/20  Iteration 1939/3560 Training loss: 1.4807 0.8885 sec/batch
Epoch 11/20  Iteration 1940/3560 Training loss: 1.4809 0.9261 sec/batch
Epoch 11/20  Iteration 1941/3560 Training loss: 1.4810 0.8869 sec/batch
Epoch 11/20  Iteration 1942/3560 Training loss: 1.4810 0.8879 sec/batch
Epoch 11/20  Iteration 1943/3560 Training loss: 1.4811 0.8770 sec/batch
Epoch 11/20  Iteration 1944/3560 Training loss: 1.4810 0.8443 sec/batch
Epoch 11/20  Iteration 1945/3560 Training loss: 1.4810 0.8477 sec/batch
Epoch 11/20  Iteration 1946/3560 Training loss: 1.4810 0.8502 sec/batch
Epoch 11/20  Iteration 1947/3560 Training loss: 1.4812 0.8431 sec/batch
Epoch 11/20  Iteration 1948/3560 Training loss: 1.4815 0.8465 sec/batch
Epoch 11/20  Iteration 1949/3560 Training loss: 1.4815 0.8623 sec/batch
Epoch 11/20  Iteration 1950/3560 Training loss: 1.4815 0.8496 sec/batch
Epoch 11/20  Iteration 1951/3560 Training loss: 1.4814 0.8501 sec/batch
Epoch 11/20  Iteration 1952/3560 Training loss: 1.4812 0.8532 sec/batch
Epoch 11/20  Iteration 1953/3560 Training loss: 1.4813 0.8477 sec/batch
Epoch 11/20  Iteration 1954/3560 Training loss: 1.4813 0.8519 sec/batch
Epoch 11/20  Iteration 1955/3560 Training loss: 1.4814 0.8969 sec/batch
Epoch 11/20  Iteration 1956/3560 Training loss: 1.4813 0.8612 sec/batch
Epoch 11/20  Iteration 1957/3560 Training loss: 1.4812 0.8558 sec/batch
Epoch 11/20  Iteration 1958/3560 Training loss: 1.4813 0.8527 sec/batch
Epoch 12/20  Iteration 1959/3560 Training loss: 1.5650 0.8635 sec/batch
Epoch 12/20  Iteration 1960/3560 Training loss: 1.5238 0.8435 sec/batch
Epoch 12/20  Iteration 1961/3560 Training loss: 1.5036 0.8501 sec/batch
Epoch 12/20  Iteration 1962/3560 Training loss: 1.4978 0.8551 sec/batch
Epoch 12/20  Iteration 1963/3560 Training loss: 1.4897 0.8993 sec/batch
Epoch 12/20  Iteration 1964/3560 Training loss: 1.4801 0.8635 sec/batch
Epoch 12/20  Iteration 1965/3560 Training loss: 1.4806 0.8471 sec/batch
Epoch 12/20  Iteration 1966/3560 Training loss: 1.4780 0.8941 sec/batch
Epoch 12/20  Iteration 1967/3560 Training loss: 1.4767 0.8415 sec/batch
Epoch 12/20  Iteration 1968/3560 Training loss: 1.4760 0.8478 sec/batch
Epoch 12/20  Iteration 1969/3560 Training loss: 1.4726 0.8467 sec/batch
Epoch 12/20  Iteration 1970/3560 Training loss: 1.4720 0.8490 sec/batch
Epoch 12/20  Iteration 1971/3560 Training loss: 1.4713 0.8397 sec/batch
Epoch 12/20  Iteration 1972/3560 Training loss: 1.4732 0.8914 sec/batch
Epoch 12/20  Iteration 1973/3560 Training loss: 1.4720 0.8497 sec/batch
Epoch 12/20  Iteration 1974/3560 Training loss: 1.4705 0.8949 sec/batch
Epoch 12/20  Iteration 1975/3560 Training loss: 1.4707 0.8491 sec/batch
Epoch 12/20  Iteration 1976/3560 Training loss: 1.4726 0.8955 sec/batch
Epoch 12/20  Iteration 1977/3560 Training loss: 1.4729 0.8472 sec/batch
Epoch 12/20  Iteration 1978/3560 Training loss: 1.4739 0.8510 sec/batch
Epoch 12/20  Iteration 1979/3560 Training loss: 1.4735 0.8495 sec/batch
Epoch 12/20  Iteration 1980/3560 Training loss: 1.4738 0.8507 sec/batch
Epoch 12/20  Iteration 1981/3560 Training loss: 1.4728 0.8549 sec/batch
Epoch 12/20  Iteration 1982/3560 Training loss: 1.4729 0.8424 sec/batch
Epoch 12/20  Iteration 1983/3560 Training loss: 1.4733 0.8524 sec/batch
Epoch 12/20  Iteration 1984/3560 Training loss: 1.4718 0.8557 sec/batch
Epoch 12/20  Iteration 1985/3560 Training loss: 1.4707 0.8506 sec/batch
Epoch 12/20  Iteration 1986/3560 Training loss: 1.4712 0.8450 sec/batch
Epoch 12/20  Iteration 1987/3560 Training loss: 1.4716 0.8389 sec/batch
Epoch 12/20  Iteration 1988/3560 Training loss: 1.4720 0.8481 sec/batch
Epoch 12/20  Iteration 1989/3560 Training loss: 1.4715 0.8560 sec/batch
Epoch 12/20  Iteration 1990/3560 Training loss: 1.4706 0.8545 sec/batch
Epoch 12/20  Iteration 1991/3560 Training loss: 1.4714 0.8420 sec/batch
Epoch 12/20  Iteration 1992/3560 Training loss: 1.4715 0.8462 sec/batch
Epoch 12/20  Iteration 1993/3560 Training loss: 1.4712 0.8556 sec/batch
Epoch 12/20  Iteration 1994/3560 Training loss: 1.4711 0.8685 sec/batch
Epoch 12/20  Iteration 1995/3560 Training loss: 1.4703 0.8560 sec/batch
Epoch 12/20  Iteration 1996/3560 Training loss: 1.4691 0.8485 sec/batch
Epoch 12/20  Iteration 1997/3560 Training loss: 1.4677 0.8947 sec/batch
Epoch 12/20  Iteration 1998/3560 Training loss: 1.4671 0.8495 sec/batch
Epoch 12/20  Iteration 1999/3560 Training loss: 1.4667 0.8462 sec/batch
Epoch 12/20  Iteration 2000/3560 Training loss: 1.4675 0.8467 sec/batch
Epoch 12/20  Iteration 2001/3560 Training loss: 1.4671 0.8518 sec/batch
Epoch 12/20  Iteration 2002/3560 Training loss: 1.4665 0.8562 sec/batch
Epoch 12/20  Iteration 2003/3560 Training loss: 1.4669 0.8447 sec/batch
Epoch 12/20  Iteration 2004/3560 Training loss: 1.4660 0.8611 sec/batch
Epoch 12/20  Iteration 2005/3560 Training loss: 1.4657 0.8471 sec/batch
Epoch 12/20  Iteration 2006/3560 Training loss: 1.4653 0.8411 sec/batch
Epoch 12/20  Iteration 2007/3560 Training loss: 1.4653 0.8513 sec/batch
Epoch 12/20  Iteration 2008/3560 Training loss: 1.4659 0.8483 sec/batch
Epoch 12/20  Iteration 2009/3560 Training loss: 1.4655 0.8482 sec/batch
Epoch 12/20  Iteration 2010/3560 Training loss: 1.4663 0.8475 sec/batch
Epoch 12/20  Iteration 2011/3560 Training loss: 1.4664 0.8489 sec/batch
Epoch 12/20  Iteration 2012/3560 Training loss: 1.4666 0.8500 sec/batch
Epoch 12/20  Iteration 2013/3560 Training loss: 1.4665 0.8504 sec/batch
Epoch 12/20  Iteration 2014/3560 Training loss: 1.4667 0.8455 sec/batch
Epoch 12/20  Iteration 2015/3560 Training loss: 1.4672 0.8471 sec/batch
Epoch 12/20  Iteration 2016/3560 Training loss: 1.4668 0.8520 sec/batch
Epoch 12/20  Iteration 2017/3560 Training loss: 1.4662 0.8698 sec/batch
Epoch 12/20  Iteration 2018/3560 Training loss: 1.4667 0.8951 sec/batch
Epoch 12/20  Iteration 2019/3560 Training loss: 1.4668 0.8475 sec/batch
Epoch 12/20  Iteration 2020/3560 Training loss: 1.4676 0.8477 sec/batch
Epoch 12/20  Iteration 2021/3560 Training loss: 1.4680 0.8957 sec/batch
Epoch 12/20  Iteration 2022/3560 Training loss: 1.4682 0.8410 sec/batch
Epoch 12/20  Iteration 2023/3560 Training loss: 1.4683 0.8512 sec/batch
Epoch 12/20  Iteration 2024/3560 Training loss: 1.4687 0.8491 sec/batch
Epoch 12/20  Iteration 2025/3560 Training loss: 1.4689 0.8542 sec/batch
Epoch 12/20  Iteration 2026/3560 Training loss: 1.4686 0.9242 sec/batch
Epoch 12/20  Iteration 2027/3560 Training loss: 1.4687 0.8482 sec/batch
Epoch 12/20  Iteration 2028/3560 Training loss: 1.4687 0.8651 sec/batch
Epoch 12/20  Iteration 2029/3560 Training loss: 1.4693 0.8744 sec/batch
Epoch 12/20  Iteration 2030/3560 Training loss: 1.4695 0.8549 sec/batch
Epoch 12/20  Iteration 2031/3560 Training loss: 1.4700 0.8415 sec/batch
Epoch 12/20  Iteration 2032/3560 Training loss: 1.4697 0.8393 sec/batch
Epoch 12/20  Iteration 2033/3560 Training loss: 1.4695 0.8553 sec/batch
Epoch 12/20  Iteration 2034/3560 Training loss: 1.4696 0.8430 sec/batch
Epoch 12/20  Iteration 2035/3560 Training loss: 1.4694 0.8472 sec/batch
Epoch 12/20  Iteration 2036/3560 Training loss: 1.4693 0.8490 sec/batch
Epoch 12/20  Iteration 2037/3560 Training loss: 1.4688 0.8524 sec/batch
Epoch 12/20  Iteration 2038/3560 Training loss: 1.4689 0.8449 sec/batch
Epoch 12/20  Iteration 2039/3560 Training loss: 1.4682 0.8595 sec/batch
Epoch 12/20  Iteration 2040/3560 Training loss: 1.4683 0.8509 sec/batch
Epoch 12/20  Iteration 2041/3560 Training loss: 1.4677 0.8484 sec/batch
Epoch 12/20  Iteration 2042/3560 Training loss: 1.4677 0.8929 sec/batch
Epoch 12/20  Iteration 2043/3560 Training loss: 1.4674 0.8527 sec/batch
Epoch 12/20  Iteration 2044/3560 Training loss: 1.4671 0.8447 sec/batch
Epoch 12/20  Iteration 2045/3560 Training loss: 1.4669 0.8483 sec/batch
Epoch 12/20  Iteration 2046/3560 Training loss: 1.4665 0.8509 sec/batch
Epoch 12/20  Iteration 2047/3560 Training loss: 1.4661 0.8501 sec/batch
Epoch 12/20  Iteration 2048/3560 Training loss: 1.4663 0.8410 sec/batch
Epoch 12/20  Iteration 2049/3560 Training loss: 1.4660 0.8478 sec/batch
Epoch 12/20  Iteration 2050/3560 Training loss: 1.4659 0.8571 sec/batch
Epoch 12/20  Iteration 2051/3560 Training loss: 1.4654 0.8583 sec/batch
Epoch 12/20  Iteration 2052/3560 Training loss: 1.4651 0.8491 sec/batch
Epoch 12/20  Iteration 2053/3560 Training loss: 1.4649 0.8454 sec/batch
Epoch 12/20  Iteration 2054/3560 Training loss: 1.4649 0.8464 sec/batch
Epoch 12/20  Iteration 2055/3560 Training loss: 1.4649 0.8553 sec/batch
Epoch 12/20  Iteration 2056/3560 Training loss: 1.4645 0.8481 sec/batch
Epoch 12/20  Iteration 2057/3560 Training loss: 1.4642 0.8542 sec/batch
Epoch 12/20  Iteration 2058/3560 Training loss: 1.4638 0.8539 sec/batch
Epoch 12/20  Iteration 2059/3560 Training loss: 1.4638 0.8539 sec/batch
Epoch 12/20  Iteration 2060/3560 Training loss: 1.4636 0.8573 sec/batch
Epoch 12/20  Iteration 2061/3560 Training loss: 1.4635 0.8453 sec/batch
Epoch 12/20  Iteration 2062/3560 Training loss: 1.4634 0.8443 sec/batch
Epoch 12/20  Iteration 2063/3560 Training loss: 1.4632 0.8893 sec/batch
Epoch 12/20  Iteration 2064/3560 Training loss: 1.4631 0.8697 sec/batch
Epoch 12/20  Iteration 2065/3560 Training loss: 1.4630 0.8504 sec/batch
Epoch 12/20  Iteration 2066/3560 Training loss: 1.4630 0.8509 sec/batch
Epoch 12/20  Iteration 2067/3560 Training loss: 1.4628 0.9199 sec/batch
Epoch 12/20  Iteration 2068/3560 Training loss: 1.4629 0.8894 sec/batch
Epoch 12/20  Iteration 2069/3560 Training loss: 1.4626 0.8487 sec/batch
Epoch 12/20  Iteration 2070/3560 Training loss: 1.4625 0.8511 sec/batch
Epoch 12/20  Iteration 2071/3560 Training loss: 1.4624 0.8706 sec/batch
Epoch 12/20  Iteration 2072/3560 Training loss: 1.4622 0.8448 sec/batch
Epoch 12/20  Iteration 2073/3560 Training loss: 1.4619 0.8575 sec/batch
Epoch 12/20  Iteration 2074/3560 Training loss: 1.4615 0.8577 sec/batch
Epoch 12/20  Iteration 2075/3560 Training loss: 1.4614 0.8549 sec/batch
Epoch 12/20  Iteration 2076/3560 Training loss: 1.4614 0.8513 sec/batch
Epoch 12/20  Iteration 2077/3560 Training loss: 1.4613 0.8485 sec/batch
Epoch 12/20  Iteration 2078/3560 Training loss: 1.4612 0.8487 sec/batch
Epoch 12/20  Iteration 2079/3560 Training loss: 1.4611 0.9088 sec/batch
Epoch 12/20  Iteration 2080/3560 Training loss: 1.4608 0.9414 sec/batch
Epoch 12/20  Iteration 2081/3560 Training loss: 1.4605 0.8525 sec/batch
Epoch 12/20  Iteration 2082/3560 Training loss: 1.4605 0.8833 sec/batch
Epoch 12/20  Iteration 2083/3560 Training loss: 1.4604 0.8485 sec/batch
Epoch 12/20  Iteration 2084/3560 Training loss: 1.4600 0.8409 sec/batch
Epoch 12/20  Iteration 2085/3560 Training loss: 1.4601 0.8491 sec/batch
Epoch 12/20  Iteration 2086/3560 Training loss: 1.4601 0.8513 sec/batch
Epoch 12/20  Iteration 2087/3560 Training loss: 1.4600 0.8990 sec/batch
Epoch 12/20  Iteration 2088/3560 Training loss: 1.4597 0.8618 sec/batch
Epoch 12/20  Iteration 2089/3560 Training loss: 1.4594 0.8643 sec/batch
Epoch 12/20  Iteration 2090/3560 Training loss: 1.4592 0.8528 sec/batch
Epoch 12/20  Iteration 2091/3560 Training loss: 1.4593 0.8730 sec/batch
Epoch 12/20  Iteration 2092/3560 Training loss: 1.4593 0.8523 sec/batch
Epoch 12/20  Iteration 2093/3560 Training loss: 1.4594 0.8482 sec/batch
Epoch 12/20  Iteration 2094/3560 Training loss: 1.4595 0.8470 sec/batch
Epoch 12/20  Iteration 2095/3560 Training loss: 1.4597 0.8854 sec/batch
Epoch 12/20  Iteration 2096/3560 Training loss: 1.4597 0.8544 sec/batch
Epoch 12/20  Iteration 2097/3560 Training loss: 1.4597 0.8880 sec/batch
Epoch 12/20  Iteration 2098/3560 Training loss: 1.4597 0.8617 sec/batch
Epoch 12/20  Iteration 2099/3560 Training loss: 1.4600 0.8473 sec/batch
Epoch 12/20  Iteration 2100/3560 Training loss: 1.4601 0.8514 sec/batch
Epoch 12/20  Iteration 2101/3560 Training loss: 1.4600 0.8568 sec/batch
Epoch 12/20  Iteration 2102/3560 Training loss: 1.4602 0.8525 sec/batch
Epoch 12/20  Iteration 2103/3560 Training loss: 1.4601 0.8443 sec/batch
Epoch 12/20  Iteration 2104/3560 Training loss: 1.4602 1.1720 sec/batch
Epoch 12/20  Iteration 2105/3560 Training loss: 1.4602 1.0162 sec/batch
Epoch 12/20  Iteration 2106/3560 Training loss: 1.4604 0.8465 sec/batch
Epoch 12/20  Iteration 2107/3560 Training loss: 1.4605 0.8486 sec/batch
Epoch 12/20  Iteration 2108/3560 Training loss: 1.4603 0.8495 sec/batch
Epoch 12/20  Iteration 2109/3560 Training loss: 1.4600 0.8508 sec/batch
Epoch 12/20  Iteration 2110/3560 Training loss: 1.4600 0.8508 sec/batch
Epoch 12/20  Iteration 2111/3560 Training loss: 1.4601 0.8959 sec/batch
Epoch 12/20  Iteration 2112/3560 Training loss: 1.4602 0.8553 sec/batch
Epoch 12/20  Iteration 2113/3560 Training loss: 1.4601 0.9028 sec/batch
Epoch 12/20  Iteration 2114/3560 Training loss: 1.4602 0.8498 sec/batch
Epoch 12/20  Iteration 2115/3560 Training loss: 1.4602 0.8566 sec/batch
Epoch 12/20  Iteration 2116/3560 Training loss: 1.4602 0.8418 sec/batch
Epoch 12/20  Iteration 2117/3560 Training loss: 1.4600 0.8950 sec/batch
Epoch 12/20  Iteration 2118/3560 Training loss: 1.4602 0.8550 sec/batch
Epoch 12/20  Iteration 2119/3560 Training loss: 1.4603 0.8455 sec/batch
Epoch 12/20  Iteration 2120/3560 Training loss: 1.4604 0.8560 sec/batch
Epoch 12/20  Iteration 2121/3560 Training loss: 1.4604 0.8502 sec/batch
Epoch 12/20  Iteration 2122/3560 Training loss: 1.4605 0.8647 sec/batch
Epoch 12/20  Iteration 2123/3560 Training loss: 1.4605 0.8479 sec/batch
Epoch 12/20  Iteration 2124/3560 Training loss: 1.4604 0.8502 sec/batch
Epoch 12/20  Iteration 2125/3560 Training loss: 1.4606 0.8508 sec/batch
Epoch 12/20  Iteration 2126/3560 Training loss: 1.4610 0.8562 sec/batch
Epoch 12/20  Iteration 2127/3560 Training loss: 1.4610 0.8541 sec/batch
Epoch 12/20  Iteration 2128/3560 Training loss: 1.4610 0.8951 sec/batch
Epoch 12/20  Iteration 2129/3560 Training loss: 1.4610 0.8498 sec/batch
Epoch 12/20  Iteration 2130/3560 Training loss: 1.4608 0.8526 sec/batch
Epoch 12/20  Iteration 2131/3560 Training loss: 1.4609 0.8558 sec/batch
Epoch 12/20  Iteration 2132/3560 Training loss: 1.4609 0.8637 sec/batch
Epoch 12/20  Iteration 2133/3560 Training loss: 1.4609 0.8582 sec/batch
Epoch 12/20  Iteration 2134/3560 Training loss: 1.4608 0.8537 sec/batch
Epoch 12/20  Iteration 2135/3560 Training loss: 1.4606 0.8478 sec/batch
Epoch 12/20  Iteration 2136/3560 Training loss: 1.4608 0.8543 sec/batch
Epoch 13/20  Iteration 2137/3560 Training loss: 1.5375 0.8465 sec/batch
Epoch 13/20  Iteration 2138/3560 Training loss: 1.5024 0.8463 sec/batch
Epoch 13/20  Iteration 2139/3560 Training loss: 1.4892 0.8520 sec/batch
Epoch 13/20  Iteration 2140/3560 Training loss: 1.4829 0.8506 sec/batch
Epoch 13/20  Iteration 2141/3560 Training loss: 1.4735 0.8491 sec/batch
Epoch 13/20  Iteration 2142/3560 Training loss: 1.4638 0.8635 sec/batch
Epoch 13/20  Iteration 2143/3560 Training loss: 1.4641 0.8491 sec/batch
Epoch 13/20  Iteration 2144/3560 Training loss: 1.4624 0.8572 sec/batch
Epoch 13/20  Iteration 2145/3560 Training loss: 1.4623 0.8493 sec/batch
Epoch 13/20  Iteration 2146/3560 Training loss: 1.4627 0.8586 sec/batch
Epoch 13/20  Iteration 2147/3560 Training loss: 1.4595 0.8852 sec/batch
Epoch 13/20  Iteration 2148/3560 Training loss: 1.4589 0.8744 sec/batch
Epoch 13/20  Iteration 2149/3560 Training loss: 1.4578 0.8705 sec/batch
Epoch 13/20  Iteration 2150/3560 Training loss: 1.4584 0.8802 sec/batch
Epoch 13/20  Iteration 2151/3560 Training loss: 1.4577 0.9796 sec/batch
Epoch 13/20  Iteration 2152/3560 Training loss: 1.4557 0.9523 sec/batch
Epoch 13/20  Iteration 2153/3560 Training loss: 1.4556 0.9857 sec/batch
Epoch 13/20  Iteration 2154/3560 Training loss: 1.4566 0.9214 sec/batch
Epoch 13/20  Iteration 2155/3560 Training loss: 1.4564 0.8753 sec/batch
Epoch 13/20  Iteration 2156/3560 Training loss: 1.4575 0.8891 sec/batch
Epoch 13/20  Iteration 2157/3560 Training loss: 1.4568 0.8491 sec/batch
Epoch 13/20  Iteration 2158/3560 Training loss: 1.4566 0.8443 sec/batch
Epoch 13/20  Iteration 2159/3560 Training loss: 1.4554 0.8488 sec/batch
Epoch 13/20  Iteration 2160/3560 Training loss: 1.4554 0.8417 sec/batch
Epoch 13/20  Iteration 2161/3560 Training loss: 1.4555 0.8405 sec/batch
Epoch 13/20  Iteration 2162/3560 Training loss: 1.4536 0.8448 sec/batch
Epoch 13/20  Iteration 2163/3560 Training loss: 1.4525 0.8465 sec/batch
Epoch 13/20  Iteration 2164/3560 Training loss: 1.4530 0.8561 sec/batch
Epoch 13/20  Iteration 2165/3560 Training loss: 1.4532 0.8516 sec/batch
Epoch 13/20  Iteration 2166/3560 Training loss: 1.4536 0.8486 sec/batch
Epoch 13/20  Iteration 2167/3560 Training loss: 1.4532 0.8544 sec/batch
Epoch 13/20  Iteration 2168/3560 Training loss: 1.4525 0.8460 sec/batch
Epoch 13/20  Iteration 2169/3560 Training loss: 1.4529 0.8602 sec/batch
Epoch 13/20  Iteration 2170/3560 Training loss: 1.4532 0.8426 sec/batch
Epoch 13/20  Iteration 2171/3560 Training loss: 1.4529 0.8411 sec/batch
Epoch 13/20  Iteration 2172/3560 Training loss: 1.4526 0.8459 sec/batch
Epoch 13/20  Iteration 2173/3560 Training loss: 1.4517 0.8548 sec/batch
Epoch 13/20  Iteration 2174/3560 Training loss: 1.4503 0.8729 sec/batch
Epoch 13/20  Iteration 2175/3560 Training loss: 1.4489 0.8405 sec/batch
Epoch 13/20  Iteration 2176/3560 Training loss: 1.4486 0.8433 sec/batch
Epoch 13/20  Iteration 2177/3560 Training loss: 1.4481 0.8475 sec/batch
Epoch 13/20  Iteration 2178/3560 Training loss: 1.4488 0.8367 sec/batch
Epoch 13/20  Iteration 2179/3560 Training loss: 1.4484 0.8423 sec/batch
Epoch 13/20  Iteration 2180/3560 Training loss: 1.4478 0.8433 sec/batch
Epoch 13/20  Iteration 2181/3560 Training loss: 1.4479 0.8393 sec/batch
Epoch 13/20  Iteration 2182/3560 Training loss: 1.4472 0.8402 sec/batch
Epoch 13/20  Iteration 2183/3560 Training loss: 1.4469 0.8376 sec/batch
Epoch 13/20  Iteration 2184/3560 Training loss: 1.4465 0.8368 sec/batch
Epoch 13/20  Iteration 2185/3560 Training loss: 1.4466 0.8458 sec/batch
Epoch 13/20  Iteration 2186/3560 Training loss: 1.4469 0.8459 sec/batch
Epoch 13/20  Iteration 2187/3560 Training loss: 1.4466 0.8434 sec/batch
Epoch 13/20  Iteration 2188/3560 Training loss: 1.4472 0.8414 sec/batch
Epoch 13/20  Iteration 2189/3560 Training loss: 1.4472 0.8487 sec/batch
Epoch 13/20  Iteration 2190/3560 Training loss: 1.4474 0.8838 sec/batch
Epoch 13/20  Iteration 2191/3560 Training loss: 1.4472 0.8421 sec/batch
Epoch 13/20  Iteration 2192/3560 Training loss: 1.4473 0.8393 sec/batch
Epoch 13/20  Iteration 2193/3560 Training loss: 1.4478 0.8425 sec/batch
Epoch 13/20  Iteration 2194/3560 Training loss: 1.4474 0.8399 sec/batch
Epoch 13/20  Iteration 2195/3560 Training loss: 1.4467 0.8458 sec/batch
Epoch 13/20  Iteration 2196/3560 Training loss: 1.4471 0.8482 sec/batch
Epoch 13/20  Iteration 2197/3560 Training loss: 1.4471 0.8443 sec/batch
Epoch 13/20  Iteration 2198/3560 Training loss: 1.4481 0.8810 sec/batch
Epoch 13/20  Iteration 2199/3560 Training loss: 1.4484 0.8461 sec/batch
Epoch 13/20  Iteration 2200/3560 Training loss: 1.4486 0.8560 sec/batch
Epoch 13/20  Iteration 2201/3560 Training loss: 1.4486 0.8446 sec/batch
Epoch 13/20  Iteration 2202/3560 Training loss: 1.4489 0.8642 sec/batch
Epoch 13/20  Iteration 2203/3560 Training loss: 1.4492 0.8399 sec/batch
Epoch 13/20  Iteration 2204/3560 Training loss: 1.4489 0.8533 sec/batch
Epoch 13/20  Iteration 2205/3560 Training loss: 1.4489 0.8453 sec/batch
Epoch 13/20  Iteration 2206/3560 Training loss: 1.4488 0.8455 sec/batch
Epoch 13/20  Iteration 2207/3560 Training loss: 1.4494 0.8402 sec/batch
Epoch 13/20  Iteration 2208/3560 Training loss: 1.4496 0.8494 sec/batch
Epoch 13/20  Iteration 2209/3560 Training loss: 1.4501 0.8448 sec/batch
Epoch 13/20  Iteration 2210/3560 Training loss: 1.4499 0.8458 sec/batch
Epoch 13/20  Iteration 2211/3560 Training loss: 1.4498 0.8635 sec/batch
Epoch 13/20  Iteration 2212/3560 Training loss: 1.4501 0.8405 sec/batch
Epoch 13/20  Iteration 2213/3560 Training loss: 1.4500 0.8494 sec/batch
Epoch 13/20  Iteration 2214/3560 Training loss: 1.4500 0.8424 sec/batch
Epoch 13/20  Iteration 2215/3560 Training loss: 1.4494 0.8474 sec/batch
Epoch 13/20  Iteration 2216/3560 Training loss: 1.4495 0.8396 sec/batch
Epoch 13/20  Iteration 2217/3560 Training loss: 1.4489 0.9400 sec/batch
Epoch 13/20  Iteration 2218/3560 Training loss: 1.4490 1.0029 sec/batch
Epoch 13/20  Iteration 2219/3560 Training loss: 1.4484 0.8819 sec/batch
Epoch 13/20  Iteration 2220/3560 Training loss: 1.4485 0.8603 sec/batch
Epoch 13/20  Iteration 2221/3560 Training loss: 1.4482 0.8981 sec/batch
Epoch 13/20  Iteration 2222/3560 Training loss: 1.4480 0.8443 sec/batch
Epoch 13/20  Iteration 2223/3560 Training loss: 1.4478 0.8379 sec/batch
Epoch 13/20  Iteration 2224/3560 Training loss: 1.4476 0.8349 sec/batch
Epoch 13/20  Iteration 2225/3560 Training loss: 1.4470 0.8455 sec/batch
Epoch 13/20  Iteration 2226/3560 Training loss: 1.4471 0.8456 sec/batch
Epoch 13/20  Iteration 2227/3560 Training loss: 1.4469 0.8435 sec/batch
Epoch 13/20  Iteration 2228/3560 Training loss: 1.4469 0.8396 sec/batch
Epoch 13/20  Iteration 2229/3560 Training loss: 1.4465 0.8346 sec/batch
Epoch 13/20  Iteration 2230/3560 Training loss: 1.4462 0.8458 sec/batch
Epoch 13/20  Iteration 2231/3560 Training loss: 1.4460 0.8389 sec/batch
Epoch 13/20  Iteration 2232/3560 Training loss: 1.4460 0.8388 sec/batch
Epoch 13/20  Iteration 2233/3560 Training loss: 1.4459 0.8456 sec/batch
Epoch 13/20  Iteration 2234/3560 Training loss: 1.4457 0.8491 sec/batch
Epoch 13/20  Iteration 2235/3560 Training loss: 1.4454 0.8422 sec/batch
Epoch 13/20  Iteration 2236/3560 Training loss: 1.4449 0.8438 sec/batch
Epoch 13/20  Iteration 2237/3560 Training loss: 1.4449 0.8552 sec/batch
Epoch 13/20  Iteration 2238/3560 Training loss: 1.4448 0.8444 sec/batch
Epoch 13/20  Iteration 2239/3560 Training loss: 1.4447 0.8419 sec/batch
Epoch 13/20  Iteration 2240/3560 Training loss: 1.4446 0.8376 sec/batch
Epoch 13/20  Iteration 2241/3560 Training loss: 1.4444 0.8376 sec/batch
Epoch 13/20  Iteration 2242/3560 Training loss: 1.4443 0.8437 sec/batch
Epoch 13/20  Iteration 2243/3560 Training loss: 1.4443 0.8408 sec/batch
Epoch 13/20  Iteration 2244/3560 Training loss: 1.4442 0.8400 sec/batch
Epoch 13/20  Iteration 2245/3560 Training loss: 1.4440 0.8472 sec/batch
Epoch 13/20  Iteration 2246/3560 Training loss: 1.4441 0.8382 sec/batch
Epoch 13/20  Iteration 2247/3560 Training loss: 1.4438 0.8466 sec/batch
Epoch 13/20  Iteration 2248/3560 Training loss: 1.4437 0.8469 sec/batch
Epoch 13/20  Iteration 2249/3560 Training loss: 1.4436 0.8455 sec/batch
Epoch 13/20  Iteration 2250/3560 Training loss: 1.4434 0.8387 sec/batch
Epoch 13/20  Iteration 2251/3560 Training loss: 1.4431 0.8361 sec/batch
Epoch 13/20  Iteration 2252/3560 Training loss: 1.4428 0.8401 sec/batch
Epoch 13/20  Iteration 2253/3560 Training loss: 1.4427 0.8488 sec/batch
Epoch 13/20  Iteration 2254/3560 Training loss: 1.4427 0.8475 sec/batch
Epoch 13/20  Iteration 2255/3560 Training loss: 1.4426 0.8421 sec/batch
Epoch 13/20  Iteration 2256/3560 Training loss: 1.4425 0.9002 sec/batch
Epoch 13/20  Iteration 2257/3560 Training loss: 1.4424 0.8389 sec/batch
Epoch 13/20  Iteration 2258/3560 Training loss: 1.4422 0.8488 sec/batch
Epoch 13/20  Iteration 2259/3560 Training loss: 1.4417 0.8509 sec/batch
Epoch 13/20  Iteration 2260/3560 Training loss: 1.4418 0.8367 sec/batch
Epoch 13/20  Iteration 2261/3560 Training loss: 1.4416 0.8379 sec/batch
Epoch 13/20  Iteration 2262/3560 Training loss: 1.4411 0.8395 sec/batch
Epoch 13/20  Iteration 2263/3560 Training loss: 1.4413 0.8476 sec/batch
Epoch 13/20  Iteration 2264/3560 Training loss: 1.4413 0.8463 sec/batch
Epoch 13/20  Iteration 2265/3560 Training loss: 1.4411 0.8407 sec/batch
Epoch 13/20  Iteration 2266/3560 Training loss: 1.4409 0.8414 sec/batch
Epoch 13/20  Iteration 2267/3560 Training loss: 1.4405 0.8452 sec/batch
Epoch 13/20  Iteration 2268/3560 Training loss: 1.4403 0.8459 sec/batch
Epoch 13/20  Iteration 2269/3560 Training loss: 1.4404 0.8395 sec/batch
Epoch 13/20  Iteration 2270/3560 Training loss: 1.4404 0.8405 sec/batch
Epoch 13/20  Iteration 2271/3560 Training loss: 1.4404 0.8505 sec/batch
Epoch 13/20  Iteration 2272/3560 Training loss: 1.4405 0.8588 sec/batch
Epoch 13/20  Iteration 2273/3560 Training loss: 1.4407 0.8405 sec/batch
Epoch 13/20  Iteration 2274/3560 Training loss: 1.4408 0.8391 sec/batch
Epoch 13/20  Iteration 2275/3560 Training loss: 1.4408 0.8387 sec/batch
Epoch 13/20  Iteration 2276/3560 Training loss: 1.4407 0.8388 sec/batch
Epoch 13/20  Iteration 2277/3560 Training loss: 1.4410 0.8840 sec/batch
Epoch 13/20  Iteration 2278/3560 Training loss: 1.4411 0.8389 sec/batch
Epoch 13/20  Iteration 2279/3560 Training loss: 1.4410 0.8383 sec/batch
Epoch 13/20  Iteration 2280/3560 Training loss: 1.4411 0.8456 sec/batch
Epoch 13/20  Iteration 2281/3560 Training loss: 1.4410 0.8412 sec/batch
Epoch 13/20  Iteration 2282/3560 Training loss: 1.4413 0.8464 sec/batch
Epoch 13/20  Iteration 2283/3560 Training loss: 1.4413 0.8854 sec/batch
Epoch 13/20  Iteration 2284/3560 Training loss: 1.4416 0.8426 sec/batch
Epoch 13/20  Iteration 2285/3560 Training loss: 1.4417 0.8452 sec/batch
Epoch 13/20  Iteration 2286/3560 Training loss: 1.4416 0.8444 sec/batch
Epoch 13/20  Iteration 2287/3560 Training loss: 1.4413 0.8432 sec/batch
Epoch 13/20  Iteration 2288/3560 Training loss: 1.4412 0.8454 sec/batch
Epoch 13/20  Iteration 2289/3560 Training loss: 1.4412 0.8350 sec/batch
Epoch 13/20  Iteration 2290/3560 Training loss: 1.4413 0.8500 sec/batch
Epoch 13/20  Iteration 2291/3560 Training loss: 1.4412 0.9043 sec/batch
Epoch 13/20  Iteration 2292/3560 Training loss: 1.4412 0.8536 sec/batch
Epoch 13/20  Iteration 2293/3560 Training loss: 1.4412 0.9976 sec/batch
Epoch 13/20  Iteration 2294/3560 Training loss: 1.4412 1.0442 sec/batch
Epoch 13/20  Iteration 2295/3560 Training loss: 1.4410 0.8760 sec/batch
Epoch 13/20  Iteration 2296/3560 Training loss: 1.4411 1.1811 sec/batch
Epoch 13/20  Iteration 2297/3560 Training loss: 1.4413 0.9780 sec/batch
Epoch 13/20  Iteration 2298/3560 Training loss: 1.4413 0.8866 sec/batch
Epoch 13/20  Iteration 2299/3560 Training loss: 1.4413 0.8738 sec/batch
Epoch 13/20  Iteration 2300/3560 Training loss: 1.4414 0.8883 sec/batch
Epoch 13/20  Iteration 2301/3560 Training loss: 1.4414 0.8883 sec/batch
Epoch 13/20  Iteration 2302/3560 Training loss: 1.4413 0.8790 sec/batch
Epoch 13/20  Iteration 2303/3560 Training loss: 1.4415 1.0348 sec/batch
Epoch 13/20  Iteration 2304/3560 Training loss: 1.4419 0.9949 sec/batch
Epoch 13/20  Iteration 2305/3560 Training loss: 1.4420 0.8889 sec/batch
Epoch 13/20  Iteration 2306/3560 Training loss: 1.4420 0.9065 sec/batch
Epoch 13/20  Iteration 2307/3560 Training loss: 1.4419 0.8533 sec/batch
Epoch 13/20  Iteration 2308/3560 Training loss: 1.4416 0.8501 sec/batch
Epoch 13/20  Iteration 2309/3560 Training loss: 1.4417 0.8540 sec/batch
Epoch 13/20  Iteration 2310/3560 Training loss: 1.4417 0.8928 sec/batch
Epoch 13/20  Iteration 2311/3560 Training loss: 1.4418 0.8794 sec/batch
Epoch 13/20  Iteration 2312/3560 Training loss: 1.4416 1.2166 sec/batch
Epoch 13/20  Iteration 2313/3560 Training loss: 1.4415 0.9719 sec/batch
Epoch 13/20  Iteration 2314/3560 Training loss: 1.4417 0.8577 sec/batch
Epoch 14/20  Iteration 2315/3560 Training loss: 1.5186 0.8693 sec/batch
Epoch 14/20  Iteration 2316/3560 Training loss: 1.4796 0.9299 sec/batch
Epoch 14/20  Iteration 2317/3560 Training loss: 1.4655 0.8438 sec/batch
Epoch 14/20  Iteration 2318/3560 Training loss: 1.4595 0.9679 sec/batch
Epoch 14/20  Iteration 2319/3560 Training loss: 1.4519 0.8659 sec/batch
Epoch 14/20  Iteration 2320/3560 Training loss: 1.4414 0.8986 sec/batch
Epoch 14/20  Iteration 2321/3560 Training loss: 1.4418 0.9588 sec/batch
Epoch 14/20  Iteration 2322/3560 Training loss: 1.4390 0.8596 sec/batch
Epoch 14/20  Iteration 2323/3560 Training loss: 1.4395 0.8841 sec/batch
Epoch 14/20  Iteration 2324/3560 Training loss: 1.4387 0.8750 sec/batch
Epoch 14/20  Iteration 2325/3560 Training loss: 1.4355 0.9295 sec/batch
Epoch 14/20  Iteration 2326/3560 Training loss: 1.4355 0.9756 sec/batch
Epoch 14/20  Iteration 2327/3560 Training loss: 1.4354 0.9943 sec/batch
Epoch 14/20  Iteration 2328/3560 Training loss: 1.4364 0.9087 sec/batch
Epoch 14/20  Iteration 2329/3560 Training loss: 1.4356 0.8946 sec/batch
Epoch 14/20  Iteration 2330/3560 Training loss: 1.4334 0.8607 sec/batch
Epoch 14/20  Iteration 2331/3560 Training loss: 1.4341 0.8474 sec/batch
Epoch 14/20  Iteration 2332/3560 Training loss: 1.4355 0.8620 sec/batch
Epoch 14/20  Iteration 2333/3560 Training loss: 1.4357 0.8490 sec/batch
Epoch 14/20  Iteration 2334/3560 Training loss: 1.4372 0.8458 sec/batch
Epoch 14/20  Iteration 2335/3560 Training loss: 1.4364 0.8392 sec/batch
Epoch 14/20  Iteration 2336/3560 Training loss: 1.4368 0.8419 sec/batch
Epoch 14/20  Iteration 2337/3560 Training loss: 1.4364 0.8869 sec/batch
Epoch 14/20  Iteration 2338/3560 Training loss: 1.4364 1.2574 sec/batch
Epoch 14/20  Iteration 2339/3560 Training loss: 1.4362 1.1124 sec/batch
Epoch 14/20  Iteration 2340/3560 Training loss: 1.4345 1.1670 sec/batch
Epoch 14/20  Iteration 2341/3560 Training loss: 1.4334 0.9320 sec/batch
Epoch 14/20  Iteration 2342/3560 Training loss: 1.4340 0.9091 sec/batch
Epoch 14/20  Iteration 2343/3560 Training loss: 1.4342 0.8892 sec/batch
Epoch 14/20  Iteration 2344/3560 Training loss: 1.4344 0.8414 sec/batch
Epoch 14/20  Iteration 2345/3560 Training loss: 1.4339 0.9049 sec/batch
Epoch 14/20  Iteration 2346/3560 Training loss: 1.4332 0.9143 sec/batch
Epoch 14/20  Iteration 2347/3560 Training loss: 1.4335 0.8831 sec/batch
Epoch 14/20  Iteration 2348/3560 Training loss: 1.4335 0.8444 sec/batch
Epoch 14/20  Iteration 2349/3560 Training loss: 1.4332 0.8408 sec/batch
Epoch 14/20  Iteration 2350/3560 Training loss: 1.4334 0.8599 sec/batch
Epoch 14/20  Iteration 2351/3560 Training loss: 1.4325 0.8496 sec/batch
Epoch 14/20  Iteration 2352/3560 Training loss: 1.4313 0.8817 sec/batch
Epoch 14/20  Iteration 2353/3560 Training loss: 1.4299 0.8991 sec/batch
Epoch 14/20  Iteration 2354/3560 Training loss: 1.4295 0.9251 sec/batch
Epoch 14/20  Iteration 2355/3560 Training loss: 1.4290 1.0385 sec/batch
Epoch 14/20  Iteration 2356/3560 Training loss: 1.4296 1.1620 sec/batch
Epoch 14/20  Iteration 2357/3560 Training loss: 1.4293 0.8400 sec/batch
Epoch 14/20  Iteration 2358/3560 Training loss: 1.4284 0.8604 sec/batch
Epoch 14/20  Iteration 2359/3560 Training loss: 1.4287 0.8611 sec/batch
Epoch 14/20  Iteration 2360/3560 Training loss: 1.4280 0.8834 sec/batch
Epoch 14/20  Iteration 2361/3560 Training loss: 1.4275 0.8711 sec/batch
Epoch 14/20  Iteration 2362/3560 Training loss: 1.4270 0.8562 sec/batch
Epoch 14/20  Iteration 2363/3560 Training loss: 1.4269 0.9103 sec/batch
Epoch 14/20  Iteration 2364/3560 Training loss: 1.4274 0.8917 sec/batch
Epoch 14/20  Iteration 2365/3560 Training loss: 1.4269 1.2446 sec/batch
Epoch 14/20  Iteration 2366/3560 Training loss: 1.4276 1.0189 sec/batch
Epoch 14/20  Iteration 2367/3560 Training loss: 1.4277 0.8706 sec/batch
Epoch 14/20  Iteration 2368/3560 Training loss: 1.4279 0.8738 sec/batch
Epoch 14/20  Iteration 2369/3560 Training loss: 1.4275 0.9101 sec/batch
Epoch 14/20  Iteration 2370/3560 Training loss: 1.4277 0.8810 sec/batch
Epoch 14/20  Iteration 2371/3560 Training loss: 1.4282 0.8944 sec/batch
Epoch 14/20  Iteration 2372/3560 Training loss: 1.4277 0.8864 sec/batch
Epoch 14/20  Iteration 2373/3560 Training loss: 1.4271 0.9708 sec/batch
Epoch 14/20  Iteration 2374/3560 Training loss: 1.4277 0.8692 sec/batch
Epoch 14/20  Iteration 2375/3560 Training loss: 1.4278 0.8866 sec/batch
Epoch 14/20  Iteration 2376/3560 Training loss: 1.4287 0.8637 sec/batch
Epoch 14/20  Iteration 2377/3560 Training loss: 1.4290 0.8698 sec/batch
Epoch 14/20  Iteration 2378/3560 Training loss: 1.4291 0.8778 sec/batch
Epoch 14/20  Iteration 2379/3560 Training loss: 1.4291 0.8538 sec/batch
Epoch 14/20  Iteration 2380/3560 Training loss: 1.4294 0.8525 sec/batch
Epoch 14/20  Iteration 2381/3560 Training loss: 1.4297 0.8968 sec/batch
Epoch 14/20  Iteration 2382/3560 Training loss: 1.4294 0.9901 sec/batch
Epoch 14/20  Iteration 2383/3560 Training loss: 1.4295 0.8509 sec/batch
Epoch 14/20  Iteration 2384/3560 Training loss: 1.4294 0.9111 sec/batch
Epoch 14/20  Iteration 2385/3560 Training loss: 1.4300 0.9133 sec/batch
Epoch 14/20  Iteration 2386/3560 Training loss: 1.4303 0.8429 sec/batch
Epoch 14/20  Iteration 2387/3560 Training loss: 1.4309 0.8473 sec/batch
Epoch 14/20  Iteration 2388/3560 Training loss: 1.4306 0.8493 sec/batch
Epoch 14/20  Iteration 2389/3560 Training loss: 1.4305 0.9083 sec/batch
Epoch 14/20  Iteration 2390/3560 Training loss: 1.4307 0.8598 sec/batch
Epoch 14/20  Iteration 2391/3560 Training loss: 1.4307 0.8699 sec/batch
Epoch 14/20  Iteration 2392/3560 Training loss: 1.4305 0.8562 sec/batch
Epoch 14/20  Iteration 2393/3560 Training loss: 1.4298 0.8467 sec/batch
Epoch 14/20  Iteration 2394/3560 Training loss: 1.4299 0.8566 sec/batch
Epoch 14/20  Iteration 2395/3560 Training loss: 1.4294 0.8763 sec/batch
Epoch 14/20  Iteration 2396/3560 Training loss: 1.4294 0.8914 sec/batch
Epoch 14/20  Iteration 2397/3560 Training loss: 1.4290 0.8949 sec/batch
Epoch 14/20  Iteration 2398/3560 Training loss: 1.4289 0.8646 sec/batch
Epoch 14/20  Iteration 2399/3560 Training loss: 1.4286 0.8667 sec/batch
Epoch 14/20  Iteration 2400/3560 Training loss: 1.4285 0.8544 sec/batch
Epoch 14/20  Iteration 2401/3560 Training loss: 1.4282 0.8485 sec/batch
Epoch 14/20  Iteration 2402/3560 Training loss: 1.4279 0.8650 sec/batch
Epoch 14/20  Iteration 2403/3560 Training loss: 1.4275 0.8849 sec/batch
Epoch 14/20  Iteration 2404/3560 Training loss: 1.4277 0.8627 sec/batch
Epoch 14/20  Iteration 2405/3560 Training loss: 1.4274 0.9217 sec/batch
Epoch 14/20  Iteration 2406/3560 Training loss: 1.4273 0.8442 sec/batch
Epoch 14/20  Iteration 2407/3560 Training loss: 1.4269 0.8792 sec/batch
Epoch 14/20  Iteration 2408/3560 Training loss: 1.4267 0.9006 sec/batch
Epoch 14/20  Iteration 2409/3560 Training loss: 1.4265 0.8590 sec/batch
Epoch 14/20  Iteration 2410/3560 Training loss: 1.4266 0.9565 sec/batch
Epoch 14/20  Iteration 2411/3560 Training loss: 1.4265 0.9147 sec/batch
Epoch 14/20  Iteration 2412/3560 Training loss: 1.4262 0.9597 sec/batch
Epoch 14/20  Iteration 2413/3560 Training loss: 1.4259 0.9186 sec/batch
Epoch 14/20  Iteration 2414/3560 Training loss: 1.4256 0.9375 sec/batch
Epoch 14/20  Iteration 2415/3560 Training loss: 1.4256 0.9218 sec/batch
Epoch 14/20  Iteration 2416/3560 Training loss: 1.4255 0.8443 sec/batch
Epoch 14/20  Iteration 2417/3560 Training loss: 1.4254 0.8668 sec/batch
Epoch 14/20  Iteration 2418/3560 Training loss: 1.4253 0.8422 sec/batch
Epoch 14/20  Iteration 2419/3560 Training loss: 1.4252 0.8851 sec/batch
Epoch 14/20  Iteration 2420/3560 Training loss: 1.4250 0.8481 sec/batch
Epoch 14/20  Iteration 2421/3560 Training loss: 1.4250 0.8625 sec/batch
Epoch 14/20  Iteration 2422/3560 Training loss: 1.4249 0.9020 sec/batch
Epoch 14/20  Iteration 2423/3560 Training loss: 1.4248 0.8989 sec/batch
Epoch 14/20  Iteration 2424/3560 Training loss: 1.4248 1.1780 sec/batch
Epoch 14/20  Iteration 2425/3560 Training loss: 1.4246 1.3880 sec/batch
Epoch 14/20  Iteration 2426/3560 Training loss: 1.4246 1.3503 sec/batch
Epoch 14/20  Iteration 2427/3560 Training loss: 1.4245 1.4110 sec/batch
Epoch 14/20  Iteration 2428/3560 Training loss: 1.4243 1.2205 sec/batch
Epoch 14/20  Iteration 2429/3560 Training loss: 1.4240 0.8936 sec/batch
Epoch 14/20  Iteration 2430/3560 Training loss: 1.4236 0.9250 sec/batch
Epoch 14/20  Iteration 2431/3560 Training loss: 1.4234 1.2249 sec/batch
Epoch 14/20  Iteration 2432/3560 Training loss: 1.4234 1.2257 sec/batch
Epoch 14/20  Iteration 2433/3560 Training loss: 1.4234 0.8893 sec/batch
Epoch 14/20  Iteration 2434/3560 Training loss: 1.4233 0.9466 sec/batch
Epoch 14/20  Iteration 2435/3560 Training loss: 1.4233 0.9026 sec/batch
Epoch 14/20  Iteration 2436/3560 Training loss: 1.4230 0.9288 sec/batch
Epoch 14/20  Iteration 2437/3560 Training loss: 1.4226 1.0312 sec/batch
Epoch 14/20  Iteration 2438/3560 Training loss: 1.4226 0.8725 sec/batch
Epoch 14/20  Iteration 2439/3560 Training loss: 1.4226 0.8589 sec/batch
Epoch 14/20  Iteration 2440/3560 Training loss: 1.4222 1.0074 sec/batch
Epoch 14/20  Iteration 2441/3560 Training loss: 1.4224 1.1553 sec/batch
Epoch 14/20  Iteration 2442/3560 Training loss: 1.4224 1.0023 sec/batch
Epoch 14/20  Iteration 2443/3560 Training loss: 1.4222 0.9245 sec/batch
Epoch 14/20  Iteration 2444/3560 Training loss: 1.4220 0.8367 sec/batch
Epoch 14/20  Iteration 2445/3560 Training loss: 1.4216 0.9169 sec/batch
Epoch 14/20  Iteration 2446/3560 Training loss: 1.4215 0.9514 sec/batch
Epoch 14/20  Iteration 2447/3560 Training loss: 1.4216 0.9707 sec/batch
Epoch 14/20  Iteration 2448/3560 Training loss: 1.4216 0.9573 sec/batch
Epoch 14/20  Iteration 2449/3560 Training loss: 1.4217 0.8876 sec/batch
Epoch 14/20  Iteration 2450/3560 Training loss: 1.4218 0.9108 sec/batch
Epoch 14/20  Iteration 2451/3560 Training loss: 1.4219 0.8631 sec/batch
Epoch 14/20  Iteration 2452/3560 Training loss: 1.4220 0.8556 sec/batch
Epoch 14/20  Iteration 2453/3560 Training loss: 1.4221 0.8619 sec/batch
Epoch 14/20  Iteration 2454/3560 Training loss: 1.4220 0.9080 sec/batch
Epoch 14/20  Iteration 2455/3560 Training loss: 1.4224 0.8548 sec/batch
Epoch 14/20  Iteration 2456/3560 Training loss: 1.4224 0.8592 sec/batch
Epoch 14/20  Iteration 2457/3560 Training loss: 1.4223 0.8703 sec/batch
Epoch 14/20  Iteration 2458/3560 Training loss: 1.4226 0.8931 sec/batch
Epoch 14/20  Iteration 2459/3560 Training loss: 1.4225 0.9550 sec/batch
Epoch 14/20  Iteration 2460/3560 Training loss: 1.4227 0.8919 sec/batch
Epoch 14/20  Iteration 2461/3560 Training loss: 1.4227 0.8633 sec/batch
Epoch 14/20  Iteration 2462/3560 Training loss: 1.4230 0.8492 sec/batch
Epoch 14/20  Iteration 2463/3560 Training loss: 1.4231 0.8422 sec/batch
Epoch 14/20  Iteration 2464/3560 Training loss: 1.4230 0.8425 sec/batch
Epoch 14/20  Iteration 2465/3560 Training loss: 1.4227 0.8966 sec/batch
Epoch 14/20  Iteration 2466/3560 Training loss: 1.4227 0.8443 sec/batch
Epoch 14/20  Iteration 2467/3560 Training loss: 1.4228 0.8443 sec/batch
Epoch 14/20  Iteration 2468/3560 Training loss: 1.4228 0.8860 sec/batch
Epoch 14/20  Iteration 2469/3560 Training loss: 1.4228 0.8821 sec/batch
Epoch 14/20  Iteration 2470/3560 Training loss: 1.4229 0.9067 sec/batch
Epoch 14/20  Iteration 2471/3560 Training loss: 1.4229 0.8845 sec/batch
Epoch 14/20  Iteration 2472/3560 Training loss: 1.4229 1.0917 sec/batch
Epoch 14/20  Iteration 2473/3560 Training loss: 1.4227 1.0437 sec/batch
Epoch 14/20  Iteration 2474/3560 Training loss: 1.4229 0.8701 sec/batch
Epoch 14/20  Iteration 2475/3560 Training loss: 1.4231 0.8914 sec/batch
Epoch 14/20  Iteration 2476/3560 Training loss: 1.4231 0.8726 sec/batch
Epoch 14/20  Iteration 2477/3560 Training loss: 1.4232 0.8740 sec/batch
Epoch 14/20  Iteration 2478/3560 Training loss: 1.4232 0.8691 sec/batch
Epoch 14/20  Iteration 2479/3560 Training loss: 1.4232 0.8490 sec/batch
Epoch 14/20  Iteration 2480/3560 Training loss: 1.4232 0.8737 sec/batch
Epoch 14/20  Iteration 2481/3560 Training loss: 1.4233 0.9418 sec/batch
Epoch 14/20  Iteration 2482/3560 Training loss: 1.4237 0.8563 sec/batch
Epoch 14/20  Iteration 2483/3560 Training loss: 1.4238 0.9611 sec/batch
Epoch 14/20  Iteration 2484/3560 Training loss: 1.4237 0.8651 sec/batch
Epoch 14/20  Iteration 2485/3560 Training loss: 1.4237 0.8893 sec/batch
Epoch 14/20  Iteration 2486/3560 Training loss: 1.4235 0.9024 sec/batch
Epoch 14/20  Iteration 2487/3560 Training loss: 1.4235 0.9104 sec/batch
Epoch 14/20  Iteration 2488/3560 Training loss: 1.4236 0.8970 sec/batch
Epoch 14/20  Iteration 2489/3560 Training loss: 1.4236 0.9031 sec/batch
Epoch 14/20  Iteration 2490/3560 Training loss: 1.4234 0.8682 sec/batch
Epoch 14/20  Iteration 2491/3560 Training loss: 1.4233 0.8698 sec/batch
Epoch 14/20  Iteration 2492/3560 Training loss: 1.4234 0.8501 sec/batch
Epoch 15/20  Iteration 2493/3560 Training loss: 1.5200 0.8895 sec/batch
Epoch 15/20  Iteration 2494/3560 Training loss: 1.4757 0.8654 sec/batch
Epoch 15/20  Iteration 2495/3560 Training loss: 1.4600 0.8541 sec/batch
Epoch 15/20  Iteration 2496/3560 Training loss: 1.4540 0.8500 sec/batch
Epoch 15/20  Iteration 2497/3560 Training loss: 1.4445 0.8476 sec/batch
Epoch 15/20  Iteration 2498/3560 Training loss: 1.4341 0.8488 sec/batch
Epoch 15/20  Iteration 2499/3560 Training loss: 1.4339 0.8536 sec/batch
Epoch 15/20  Iteration 2500/3560 Training loss: 1.4308 0.8486 sec/batch
Epoch 15/20  Iteration 2501/3560 Training loss: 1.4300 0.8674 sec/batch
Epoch 15/20  Iteration 2502/3560 Training loss: 1.4289 0.8622 sec/batch
Epoch 15/20  Iteration 2503/3560 Training loss: 1.4255 0.8465 sec/batch
Epoch 15/20  Iteration 2504/3560 Training loss: 1.4247 0.8619 sec/batch
Epoch 15/20  Iteration 2505/3560 Training loss: 1.4237 0.8519 sec/batch
Epoch 15/20  Iteration 2506/3560 Training loss: 1.4249 0.8734 sec/batch
Epoch 15/20  Iteration 2507/3560 Training loss: 1.4236 0.8529 sec/batch
Epoch 15/20  Iteration 2508/3560 Training loss: 1.4216 0.8525 sec/batch
Epoch 15/20  Iteration 2509/3560 Training loss: 1.4209 0.8494 sec/batch
Epoch 15/20  Iteration 2510/3560 Training loss: 1.4218 0.8471 sec/batch
Epoch 15/20  Iteration 2511/3560 Training loss: 1.4217 0.8505 sec/batch
Epoch 15/20  Iteration 2512/3560 Training loss: 1.4231 0.8463 sec/batch
Epoch 15/20  Iteration 2513/3560 Training loss: 1.4223 0.8550 sec/batch
Epoch 15/20  Iteration 2514/3560 Training loss: 1.4225 0.8787 sec/batch
Epoch 15/20  Iteration 2515/3560 Training loss: 1.4219 0.8680 sec/batch
Epoch 15/20  Iteration 2516/3560 Training loss: 1.4218 0.8544 sec/batch
Epoch 15/20  Iteration 2517/3560 Training loss: 1.4219 0.8576 sec/batch
Epoch 15/20  Iteration 2518/3560 Training loss: 1.4203 0.8729 sec/batch
Epoch 15/20  Iteration 2519/3560 Training loss: 1.4194 0.8510 sec/batch
Epoch 15/20  Iteration 2520/3560 Training loss: 1.4198 0.8610 sec/batch
Epoch 15/20  Iteration 2521/3560 Training loss: 1.4201 0.8560 sec/batch
Epoch 15/20  Iteration 2522/3560 Training loss: 1.4202 0.8497 sec/batch
Epoch 15/20  Iteration 2523/3560 Training loss: 1.4195 0.8775 sec/batch
Epoch 15/20  Iteration 2524/3560 Training loss: 1.4188 0.8594 sec/batch
Epoch 15/20  Iteration 2525/3560 Training loss: 1.4192 0.8944 sec/batch
Epoch 15/20  Iteration 2526/3560 Training loss: 1.4193 0.8772 sec/batch
Epoch 15/20  Iteration 2527/3560 Training loss: 1.4188 0.8611 sec/batch
Epoch 15/20  Iteration 2528/3560 Training loss: 1.4188 0.8647 sec/batch
Epoch 15/20  Iteration 2529/3560 Training loss: 1.4178 0.8747 sec/batch
Epoch 15/20  Iteration 2530/3560 Training loss: 1.4169 0.8670 sec/batch
Epoch 15/20  Iteration 2531/3560 Training loss: 1.4155 0.8507 sec/batch
Epoch 15/20  Iteration 2532/3560 Training loss: 1.4150 0.9325 sec/batch
Epoch 15/20  Iteration 2533/3560 Training loss: 1.4145 0.9467 sec/batch
Epoch 15/20  Iteration 2534/3560 Training loss: 1.4155 0.8837 sec/batch
Epoch 15/20  Iteration 2535/3560 Training loss: 1.4152 0.8656 sec/batch
Epoch 15/20  Iteration 2536/3560 Training loss: 1.4148 0.8580 sec/batch
Epoch 15/20  Iteration 2537/3560 Training loss: 1.4151 0.9211 sec/batch
Epoch 15/20  Iteration 2538/3560 Training loss: 1.4143 0.8714 sec/batch
Epoch 15/20  Iteration 2539/3560 Training loss: 1.4137 0.8474 sec/batch
Epoch 15/20  Iteration 2540/3560 Training loss: 1.4132 0.8840 sec/batch
Epoch 15/20  Iteration 2541/3560 Training loss: 1.4132 0.8497 sec/batch
Epoch 15/20  Iteration 2542/3560 Training loss: 1.4136 0.8651 sec/batch
Epoch 15/20  Iteration 2543/3560 Training loss: 1.4132 0.8562 sec/batch
Epoch 15/20  Iteration 2544/3560 Training loss: 1.4139 0.8710 sec/batch
Epoch 15/20  Iteration 2545/3560 Training loss: 1.4142 0.8801 sec/batch
Epoch 15/20  Iteration 2546/3560 Training loss: 1.4142 0.9568 sec/batch
Epoch 15/20  Iteration 2547/3560 Training loss: 1.4139 0.9176 sec/batch
Epoch 15/20  Iteration 2548/3560 Training loss: 1.4141 0.8443 sec/batch
Epoch 15/20  Iteration 2549/3560 Training loss: 1.4145 0.8604 sec/batch
Epoch 15/20  Iteration 2550/3560 Training loss: 1.4141 0.8513 sec/batch
Epoch 15/20  Iteration 2551/3560 Training loss: 1.4136 0.8460 sec/batch
Epoch 15/20  Iteration 2552/3560 Training loss: 1.4142 0.8511 sec/batch
Epoch 15/20  Iteration 2553/3560 Training loss: 1.4143 0.8576 sec/batch
Epoch 15/20  Iteration 2554/3560 Training loss: 1.4150 0.8551 sec/batch
Epoch 15/20  Iteration 2555/3560 Training loss: 1.4153 0.8478 sec/batch
Epoch 15/20  Iteration 2556/3560 Training loss: 1.4155 0.8469 sec/batch
Epoch 15/20  Iteration 2557/3560 Training loss: 1.4155 0.8466 sec/batch
Epoch 15/20  Iteration 2558/3560 Training loss: 1.4158 0.8446 sec/batch
Epoch 15/20  Iteration 2559/3560 Training loss: 1.4161 0.8511 sec/batch
Epoch 15/20  Iteration 2560/3560 Training loss: 1.4158 0.8461 sec/batch
Epoch 15/20  Iteration 2561/3560 Training loss: 1.4160 0.8482 sec/batch
Epoch 15/20  Iteration 2562/3560 Training loss: 1.4159 0.8536 sec/batch
Epoch 15/20  Iteration 2563/3560 Training loss: 1.4165 0.8501 sec/batch
Epoch 15/20  Iteration 2564/3560 Training loss: 1.4168 0.8540 sec/batch
Epoch 15/20  Iteration 2565/3560 Training loss: 1.4172 0.8429 sec/batch
Epoch 15/20  Iteration 2566/3560 Training loss: 1.4169 0.8474 sec/batch
Epoch 15/20  Iteration 2567/3560 Training loss: 1.4168 0.8590 sec/batch
Epoch 15/20  Iteration 2568/3560 Training loss: 1.4171 0.8505 sec/batch
Epoch 15/20  Iteration 2569/3560 Training loss: 1.4170 0.8687 sec/batch
Epoch 15/20  Iteration 2570/3560 Training loss: 1.4169 0.8687 sec/batch
Epoch 15/20  Iteration 2571/3560 Training loss: 1.4163 0.8483 sec/batch
Epoch 15/20  Iteration 2572/3560 Training loss: 1.4164 0.8512 sec/batch
Epoch 15/20  Iteration 2573/3560 Training loss: 1.4159 0.8417 sec/batch
Epoch 15/20  Iteration 2574/3560 Training loss: 1.4160 0.8575 sec/batch
Epoch 15/20  Iteration 2575/3560 Training loss: 1.4155 0.8465 sec/batch
Epoch 15/20  Iteration 2576/3560 Training loss: 1.4155 0.8456 sec/batch
Epoch 15/20  Iteration 2577/3560 Training loss: 1.4152 0.8529 sec/batch
Epoch 15/20  Iteration 2578/3560 Training loss: 1.4151 0.8511 sec/batch
Epoch 15/20  Iteration 2579/3560 Training loss: 1.4150 0.8857 sec/batch
Epoch 15/20  Iteration 2580/3560 Training loss: 1.4147 0.8481 sec/batch
Epoch 15/20  Iteration 2581/3560 Training loss: 1.4143 0.9406 sec/batch
Epoch 15/20  Iteration 2582/3560 Training loss: 1.4144 0.8566 sec/batch
Epoch 15/20  Iteration 2583/3560 Training loss: 1.4142 0.8762 sec/batch
Epoch 15/20  Iteration 2584/3560 Training loss: 1.4142 0.8486 sec/batch
Epoch 15/20  Iteration 2585/3560 Training loss: 1.4137 0.8521 sec/batch
Epoch 15/20  Iteration 2586/3560 Training loss: 1.4134 0.8695 sec/batch
Epoch 15/20  Iteration 2587/3560 Training loss: 1.4133 0.8643 sec/batch
Epoch 15/20  Iteration 2588/3560 Training loss: 1.4134 0.8647 sec/batch
Epoch 15/20  Iteration 2589/3560 Training loss: 1.4134 0.8608 sec/batch
Epoch 15/20  Iteration 2590/3560 Training loss: 1.4130 0.8700 sec/batch
Epoch 15/20  Iteration 2591/3560 Training loss: 1.4126 0.8540 sec/batch
Epoch 15/20  Iteration 2592/3560 Training loss: 1.4123 0.8510 sec/batch
Epoch 15/20  Iteration 2593/3560 Training loss: 1.4123 0.8464 sec/batch
Epoch 15/20  Iteration 2594/3560 Training loss: 1.4122 0.8571 sec/batch
Epoch 15/20  Iteration 2595/3560 Training loss: 1.4121 0.8506 sec/batch
Epoch 15/20  Iteration 2596/3560 Training loss: 1.4119 0.8477 sec/batch
Epoch 15/20  Iteration 2597/3560 Training loss: 1.4118 0.8485 sec/batch
Epoch 15/20  Iteration 2598/3560 Training loss: 1.4118 0.8465 sec/batch
Epoch 15/20  Iteration 2599/3560 Training loss: 1.4118 0.8910 sec/batch
Epoch 15/20  Iteration 2600/3560 Training loss: 1.4118 0.8472 sec/batch
Epoch 15/20  Iteration 2601/3560 Training loss: 1.4116 0.8591 sec/batch
Epoch 15/20  Iteration 2602/3560 Training loss: 1.4116 0.8550 sec/batch
Epoch 15/20  Iteration 2603/3560 Training loss: 1.4114 0.8540 sec/batch
Epoch 15/20  Iteration 2604/3560 Training loss: 1.4113 0.9252 sec/batch
Epoch 15/20  Iteration 2605/3560 Training loss: 1.4113 0.8900 sec/batch
Epoch 15/20  Iteration 2606/3560 Training loss: 1.4110 0.8564 sec/batch
Epoch 15/20  Iteration 2607/3560 Training loss: 1.4108 0.8447 sec/batch
Epoch 15/20  Iteration 2608/3560 Training loss: 1.4104 0.8532 sec/batch
Epoch 15/20  Iteration 2609/3560 Training loss: 1.4103 0.8485 sec/batch
Epoch 15/20  Iteration 2610/3560 Training loss: 1.4102 0.8601 sec/batch
Epoch 15/20  Iteration 2611/3560 Training loss: 1.4101 0.8513 sec/batch
Epoch 15/20  Iteration 2612/3560 Training loss: 1.4100 0.8549 sec/batch
Epoch 15/20  Iteration 2613/3560 Training loss: 1.4100 0.8531 sec/batch
Epoch 15/20  Iteration 2614/3560 Training loss: 1.4096 0.8472 sec/batch
Epoch 15/20  Iteration 2615/3560 Training loss: 1.4092 0.8476 sec/batch
Epoch 15/20  Iteration 2616/3560 Training loss: 1.4093 0.8505 sec/batch
Epoch 15/20  Iteration 2617/3560 Training loss: 1.4093 0.8499 sec/batch
Epoch 15/20  Iteration 2618/3560 Training loss: 1.4089 0.8701 sec/batch
Epoch 15/20  Iteration 2619/3560 Training loss: 1.4090 0.8680 sec/batch
Epoch 15/20  Iteration 2620/3560 Training loss: 1.4090 0.9179 sec/batch
Epoch 15/20  Iteration 2621/3560 Training loss: 1.4089 0.9382 sec/batch
Epoch 15/20  Iteration 2622/3560 Training loss: 1.4086 0.8865 sec/batch
Epoch 15/20  Iteration 2623/3560 Training loss: 1.4082 0.8810 sec/batch
Epoch 15/20  Iteration 2624/3560 Training loss: 1.4080 0.8537 sec/batch
Epoch 15/20  Iteration 2625/3560 Training loss: 1.4081 0.8491 sec/batch
Epoch 15/20  Iteration 2626/3560 Training loss: 1.4081 0.8493 sec/batch
Epoch 15/20  Iteration 2627/3560 Training loss: 1.4082 0.8482 sec/batch
Epoch 15/20  Iteration 2628/3560 Training loss: 1.4083 0.8481 sec/batch
Epoch 15/20  Iteration 2629/3560 Training loss: 1.4084 0.8500 sec/batch
Epoch 15/20  Iteration 2630/3560 Training loss: 1.4085 0.8524 sec/batch
Epoch 15/20  Iteration 2631/3560 Training loss: 1.4085 0.8470 sec/batch
Epoch 15/20  Iteration 2632/3560 Training loss: 1.4085 0.8863 sec/batch
Epoch 15/20  Iteration 2633/3560 Training loss: 1.4088 0.8523 sec/batch
Epoch 15/20  Iteration 2634/3560 Training loss: 1.4088 0.8670 sec/batch
Epoch 15/20  Iteration 2635/3560 Training loss: 1.4087 0.8644 sec/batch
Epoch 15/20  Iteration 2636/3560 Training loss: 1.4089 0.8589 sec/batch
Epoch 15/20  Iteration 2637/3560 Training loss: 1.4088 0.9018 sec/batch
Epoch 15/20  Iteration 2638/3560 Training loss: 1.4091 0.8617 sec/batch
Epoch 15/20  Iteration 2639/3560 Training loss: 1.4092 0.8683 sec/batch
Epoch 15/20  Iteration 2640/3560 Training loss: 1.4095 0.8804 sec/batch
Epoch 15/20  Iteration 2641/3560 Training loss: 1.4097 0.8603 sec/batch
Epoch 15/20  Iteration 2642/3560 Training loss: 1.4097 0.8467 sec/batch
Epoch 15/20  Iteration 2643/3560 Training loss: 1.4094 0.8525 sec/batch
Epoch 15/20  Iteration 2644/3560 Training loss: 1.4093 0.8603 sec/batch
Epoch 15/20  Iteration 2645/3560 Training loss: 1.4094 0.9335 sec/batch
Epoch 15/20  Iteration 2646/3560 Training loss: 1.4094 0.8978 sec/batch
Epoch 15/20  Iteration 2647/3560 Training loss: 1.4094 0.8507 sec/batch
Epoch 15/20  Iteration 2648/3560 Training loss: 1.4095 0.8577 sec/batch
Epoch 15/20  Iteration 2649/3560 Training loss: 1.4095 0.8557 sec/batch
Epoch 15/20  Iteration 2650/3560 Training loss: 1.4095 0.8463 sec/batch
Epoch 15/20  Iteration 2651/3560 Training loss: 1.4094 0.8641 sec/batch
Epoch 15/20  Iteration 2652/3560 Training loss: 1.4095 0.8546 sec/batch
Epoch 15/20  Iteration 2653/3560 Training loss: 1.4097 0.8913 sec/batch
Epoch 15/20  Iteration 2654/3560 Training loss: 1.4097 0.8488 sec/batch
Epoch 15/20  Iteration 2655/3560 Training loss: 1.4098 0.8749 sec/batch
Epoch 15/20  Iteration 2656/3560 Training loss: 1.4098 0.8579 sec/batch
Epoch 15/20  Iteration 2657/3560 Training loss: 1.4098 0.8678 sec/batch
Epoch 15/20  Iteration 2658/3560 Training loss: 1.4098 0.8648 sec/batch
Epoch 15/20  Iteration 2659/3560 Training loss: 1.4099 0.8688 sec/batch
Epoch 15/20  Iteration 2660/3560 Training loss: 1.4104 0.8898 sec/batch
Epoch 15/20  Iteration 2661/3560 Training loss: 1.4103 1.0498 sec/batch
Epoch 15/20  Iteration 2662/3560 Training loss: 1.4103 0.8973 sec/batch
Epoch 15/20  Iteration 2663/3560 Training loss: 1.4102 0.8613 sec/batch
Epoch 15/20  Iteration 2664/3560 Training loss: 1.4101 0.8641 sec/batch
Epoch 15/20  Iteration 2665/3560 Training loss: 1.4102 0.8561 sec/batch
Epoch 15/20  Iteration 2666/3560 Training loss: 1.4102 0.8599 sec/batch
Epoch 15/20  Iteration 2667/3560 Training loss: 1.4103 0.8476 sec/batch
Epoch 15/20  Iteration 2668/3560 Training loss: 1.4101 0.8648 sec/batch
Epoch 15/20  Iteration 2669/3560 Training loss: 1.4100 0.8678 sec/batch
Epoch 15/20  Iteration 2670/3560 Training loss: 1.4101 0.8571 sec/batch
Epoch 16/20  Iteration 2671/3560 Training loss: 1.4897 0.8575 sec/batch
Epoch 16/20  Iteration 2672/3560 Training loss: 1.4554 0.8433 sec/batch
Epoch 16/20  Iteration 2673/3560 Training loss: 1.4413 0.8862 sec/batch
Epoch 16/20  Iteration 2674/3560 Training loss: 1.4346 0.8650 sec/batch
Epoch 16/20  Iteration 2675/3560 Training loss: 1.4252 0.8489 sec/batch
Epoch 16/20  Iteration 2676/3560 Training loss: 1.4144 0.8688 sec/batch
Epoch 16/20  Iteration 2677/3560 Training loss: 1.4151 0.8676 sec/batch
Epoch 16/20  Iteration 2678/3560 Training loss: 1.4135 0.9022 sec/batch
Epoch 16/20  Iteration 2679/3560 Training loss: 1.4125 0.8890 sec/batch
Epoch 16/20  Iteration 2680/3560 Training loss: 1.4113 0.8855 sec/batch
Epoch 16/20  Iteration 2681/3560 Training loss: 1.4084 0.8706 sec/batch
Epoch 16/20  Iteration 2682/3560 Training loss: 1.4086 0.8508 sec/batch
Epoch 16/20  Iteration 2683/3560 Training loss: 1.4081 0.8720 sec/batch
Epoch 16/20  Iteration 2684/3560 Training loss: 1.4089 0.8612 sec/batch
Epoch 16/20  Iteration 2685/3560 Training loss: 1.4080 0.8828 sec/batch
Epoch 16/20  Iteration 2686/3560 Training loss: 1.4057 0.8522 sec/batch
Epoch 16/20  Iteration 2687/3560 Training loss: 1.4066 0.8516 sec/batch
Epoch 16/20  Iteration 2688/3560 Training loss: 1.4083 0.8458 sec/batch
Epoch 16/20  Iteration 2689/3560 Training loss: 1.4083 0.8510 sec/batch
Epoch 16/20  Iteration 2690/3560 Training loss: 1.4092 0.8444 sec/batch
Epoch 16/20  Iteration 2691/3560 Training loss: 1.4082 0.8501 sec/batch
Epoch 16/20  Iteration 2692/3560 Training loss: 1.4083 1.0667 sec/batch
Epoch 16/20  Iteration 2693/3560 Training loss: 1.4073 0.9173 sec/batch
Epoch 16/20  Iteration 2694/3560 Training loss: 1.4074 0.8738 sec/batch
Epoch 16/20  Iteration 2695/3560 Training loss: 1.4076 0.8561 sec/batch
Epoch 16/20  Iteration 2696/3560 Training loss: 1.4061 0.8587 sec/batch
Epoch 16/20  Iteration 2697/3560 Training loss: 1.4051 0.8768 sec/batch
Epoch 16/20  Iteration 2698/3560 Training loss: 1.4057 0.8376 sec/batch
Epoch 16/20  Iteration 2699/3560 Training loss: 1.4059 0.8438 sec/batch
Epoch 16/20  Iteration 2700/3560 Training loss: 1.4060 0.8534 sec/batch
Epoch 16/20  Iteration 2701/3560 Training loss: 1.4056 0.8485 sec/batch
Epoch 16/20  Iteration 2702/3560 Training loss: 1.4047 0.8448 sec/batch
Epoch 16/20  Iteration 2703/3560 Training loss: 1.4051 0.8533 sec/batch
Epoch 16/20  Iteration 2704/3560 Training loss: 1.4055 0.8403 sec/batch
Epoch 16/20  Iteration 2705/3560 Training loss: 1.4052 0.8411 sec/batch
Epoch 16/20  Iteration 2706/3560 Training loss: 1.4048 0.8567 sec/batch
Epoch 16/20  Iteration 2707/3560 Training loss: 1.4039 0.8387 sec/batch
Epoch 16/20  Iteration 2708/3560 Training loss: 1.4028 0.8603 sec/batch
Epoch 16/20  Iteration 2709/3560 Training loss: 1.4015 0.8508 sec/batch
Epoch 16/20  Iteration 2710/3560 Training loss: 1.4011 0.8415 sec/batch
Epoch 16/20  Iteration 2711/3560 Training loss: 1.4007 0.8375 sec/batch
Epoch 16/20  Iteration 2712/3560 Training loss: 1.4016 0.8452 sec/batch
Epoch 16/20  Iteration 2713/3560 Training loss: 1.4012 0.8500 sec/batch
Epoch 16/20  Iteration 2714/3560 Training loss: 1.4007 0.8610 sec/batch
Epoch 16/20  Iteration 2715/3560 Training loss: 1.4011 0.8536 sec/batch
Epoch 16/20  Iteration 2716/3560 Training loss: 1.4003 0.8391 sec/batch
Epoch 16/20  Iteration 2717/3560 Training loss: 1.3999 0.8430 sec/batch
Epoch 16/20  Iteration 2718/3560 Training loss: 1.3995 0.8622 sec/batch
Epoch 16/20  Iteration 2719/3560 Training loss: 1.3996 0.9236 sec/batch
Epoch 16/20  Iteration 2720/3560 Training loss: 1.4000 0.9072 sec/batch
Epoch 16/20  Iteration 2721/3560 Training loss: 1.3997 0.9090 sec/batch
Epoch 16/20  Iteration 2722/3560 Training loss: 1.4004 0.8844 sec/batch
Epoch 16/20  Iteration 2723/3560 Training loss: 1.4005 0.8726 sec/batch
Epoch 16/20  Iteration 2724/3560 Training loss: 1.4008 0.8529 sec/batch
Epoch 16/20  Iteration 2725/3560 Training loss: 1.4005 0.8467 sec/batch
Epoch 16/20  Iteration 2726/3560 Training loss: 1.4007 0.9008 sec/batch
Epoch 16/20  Iteration 2727/3560 Training loss: 1.4011 0.8749 sec/batch
Epoch 16/20  Iteration 2728/3560 Training loss: 1.4006 0.8802 sec/batch
Epoch 16/20  Iteration 2729/3560 Training loss: 1.4000 0.8742 sec/batch
Epoch 16/20  Iteration 2730/3560 Training loss: 1.4004 0.8943 sec/batch
Epoch 16/20  Iteration 2731/3560 Training loss: 1.4006 0.9092 sec/batch
Epoch 16/20  Iteration 2732/3560 Training loss: 1.4014 0.8667 sec/batch
Epoch 16/20  Iteration 2733/3560 Training loss: 1.4017 0.8711 sec/batch
Epoch 16/20  Iteration 2734/3560 Training loss: 1.4019 0.8564 sec/batch
Epoch 16/20  Iteration 2735/3560 Training loss: 1.4018 0.8565 sec/batch
Epoch 16/20  Iteration 2736/3560 Training loss: 1.4021 0.8656 sec/batch
Epoch 16/20  Iteration 2737/3560 Training loss: 1.4024 0.8747 sec/batch
Epoch 16/20  Iteration 2738/3560 Training loss: 1.4020 0.8665 sec/batch
Epoch 16/20  Iteration 2739/3560 Training loss: 1.4021 0.8567 sec/batch
Epoch 16/20  Iteration 2740/3560 Training loss: 1.4019 0.8791 sec/batch
Epoch 16/20  Iteration 2741/3560 Training loss: 1.4026 0.8472 sec/batch
Epoch 16/20  Iteration 2742/3560 Training loss: 1.4029 0.9380 sec/batch
Epoch 16/20  Iteration 2743/3560 Training loss: 1.4033 0.9600 sec/batch
Epoch 16/20  Iteration 2744/3560 Training loss: 1.4029 0.8484 sec/batch
Epoch 16/20  Iteration 2745/3560 Training loss: 1.4028 0.8691 sec/batch
Epoch 16/20  Iteration 2746/3560 Training loss: 1.4030 0.8750 sec/batch
Epoch 16/20  Iteration 2747/3560 Training loss: 1.4031 0.8616 sec/batch
Epoch 16/20  Iteration 2748/3560 Training loss: 1.4029 0.8909 sec/batch
Epoch 16/20  Iteration 2749/3560 Training loss: 1.4023 0.8993 sec/batch
Epoch 16/20  Iteration 2750/3560 Training loss: 1.4022 0.8850 sec/batch
Epoch 16/20  Iteration 2751/3560 Training loss: 1.4018 0.8493 sec/batch
Epoch 16/20  Iteration 2752/3560 Training loss: 1.4018 0.9148 sec/batch
Epoch 16/20  Iteration 2753/3560 Training loss: 1.4014 0.8742 sec/batch
Epoch 16/20  Iteration 2754/3560 Training loss: 1.4015 0.8761 sec/batch
Epoch 16/20  Iteration 2755/3560 Training loss: 1.4011 0.8588 sec/batch
Epoch 16/20  Iteration 2756/3560 Training loss: 1.4011 0.8678 sec/batch
Epoch 16/20  Iteration 2757/3560 Training loss: 1.4010 0.8660 sec/batch
Epoch 16/20  Iteration 2758/3560 Training loss: 1.4007 0.9662 sec/batch
Epoch 16/20  Iteration 2759/3560 Training loss: 1.4003 0.8811 sec/batch
Epoch 16/20  Iteration 2760/3560 Training loss: 1.4003 0.8679 sec/batch
Epoch 16/20  Iteration 2761/3560 Training loss: 1.4001 0.8463 sec/batch
Epoch 16/20  Iteration 2762/3560 Training loss: 1.4000 0.8481 sec/batch
Epoch 16/20  Iteration 2763/3560 Training loss: 1.3997 0.8820 sec/batch
Epoch 16/20  Iteration 2764/3560 Training loss: 1.3994 0.9045 sec/batch
Epoch 16/20  Iteration 2765/3560 Training loss: 1.3993 0.8532 sec/batch
Epoch 16/20  Iteration 2766/3560 Training loss: 1.3993 0.8530 sec/batch
Epoch 16/20  Iteration 2767/3560 Training loss: 1.3994 0.8442 sec/batch
Epoch 16/20  Iteration 2768/3560 Training loss: 1.3991 0.8497 sec/batch
Epoch 16/20  Iteration 2769/3560 Training loss: 1.3987 0.8563 sec/batch
Epoch 16/20  Iteration 2770/3560 Training loss: 1.3984 0.8537 sec/batch
Epoch 16/20  Iteration 2771/3560 Training loss: 1.3983 0.8476 sec/batch
Epoch 16/20  Iteration 2772/3560 Training loss: 1.3982 0.8444 sec/batch
Epoch 16/20  Iteration 2773/3560 Training loss: 1.3982 0.8996 sec/batch
Epoch 16/20  Iteration 2774/3560 Training loss: 1.3981 0.8504 sec/batch
Epoch 16/20  Iteration 2775/3560 Training loss: 1.3981 0.8614 sec/batch
Epoch 16/20  Iteration 2776/3560 Training loss: 1.3980 0.8950 sec/batch
Epoch 16/20  Iteration 2777/3560 Training loss: 1.3979 0.8726 sec/batch
Epoch 16/20  Iteration 2778/3560 Training loss: 1.3980 0.8532 sec/batch
Epoch 16/20  Iteration 2779/3560 Training loss: 1.3978 0.8580 sec/batch
Epoch 16/20  Iteration 2780/3560 Training loss: 1.3979 0.8554 sec/batch
Epoch 16/20  Iteration 2781/3560 Training loss: 1.3977 0.8552 sec/batch
Epoch 16/20  Iteration 2782/3560 Training loss: 1.3977 1.0288 sec/batch
Epoch 16/20  Iteration 2783/3560 Training loss: 1.3976 1.0648 sec/batch
Epoch 16/20  Iteration 2784/3560 Training loss: 1.3974 0.8652 sec/batch
Epoch 16/20  Iteration 2785/3560 Training loss: 1.3971 0.8457 sec/batch
Epoch 16/20  Iteration 2786/3560 Training loss: 1.3968 0.8513 sec/batch
Epoch 16/20  Iteration 2787/3560 Training loss: 1.3967 0.8729 sec/batch
Epoch 16/20  Iteration 2788/3560 Training loss: 1.3966 0.8948 sec/batch
Epoch 16/20  Iteration 2789/3560 Training loss: 1.3965 0.8673 sec/batch
Epoch 16/20  Iteration 2790/3560 Training loss: 1.3964 0.8973 sec/batch
Epoch 16/20  Iteration 2791/3560 Training loss: 1.3963 0.8763 sec/batch
Epoch 16/20  Iteration 2792/3560 Training loss: 1.3961 0.8428 sec/batch
Epoch 16/20  Iteration 2793/3560 Training loss: 1.3957 0.8479 sec/batch
Epoch 16/20  Iteration 2794/3560 Training loss: 1.3958 0.8473 sec/batch
Epoch 16/20  Iteration 2795/3560 Training loss: 1.3957 0.8871 sec/batch
Epoch 16/20  Iteration 2796/3560 Training loss: 1.3953 0.8451 sec/batch
Epoch 16/20  Iteration 2797/3560 Training loss: 1.3954 0.8456 sec/batch
Epoch 16/20  Iteration 2798/3560 Training loss: 1.3954 0.8416 sec/batch
Epoch 16/20  Iteration 2799/3560 Training loss: 1.3953 0.8463 sec/batch
Epoch 16/20  Iteration 2800/3560 Training loss: 1.3950 0.8467 sec/batch
Epoch 16/20  Iteration 2801/3560 Training loss: 1.3946 0.8560 sec/batch
Epoch 16/20  Iteration 2802/3560 Training loss: 1.3944 0.8487 sec/batch
Epoch 16/20  Iteration 2803/3560 Training loss: 1.3946 0.8405 sec/batch
Epoch 16/20  Iteration 2804/3560 Training loss: 1.3947 0.8500 sec/batch
Epoch 16/20  Iteration 2805/3560 Training loss: 1.3946 0.8752 sec/batch
Epoch 16/20  Iteration 2806/3560 Training loss: 1.3946 0.9493 sec/batch
Epoch 16/20  Iteration 2807/3560 Training loss: 1.3947 0.8789 sec/batch
Epoch 16/20  Iteration 2808/3560 Training loss: 1.3948 0.8503 sec/batch
Epoch 16/20  Iteration 2809/3560 Training loss: 1.3949 0.9102 sec/batch
Epoch 16/20  Iteration 2810/3560 Training loss: 1.3949 0.8941 sec/batch
Epoch 16/20  Iteration 2811/3560 Training loss: 1.3952 0.8752 sec/batch
Epoch 16/20  Iteration 2812/3560 Training loss: 1.3952 0.8437 sec/batch
Epoch 16/20  Iteration 2813/3560 Training loss: 1.3952 0.8442 sec/batch
Epoch 16/20  Iteration 2814/3560 Training loss: 1.3954 0.8985 sec/batch
Epoch 16/20  Iteration 2815/3560 Training loss: 1.3953 0.9116 sec/batch
Epoch 16/20  Iteration 2816/3560 Training loss: 1.3956 0.9187 sec/batch
Epoch 16/20  Iteration 2817/3560 Training loss: 1.3956 0.8738 sec/batch
Epoch 16/20  Iteration 2818/3560 Training loss: 1.3958 0.8551 sec/batch
Epoch 16/20  Iteration 2819/3560 Training loss: 1.3960 0.8771 sec/batch
Epoch 16/20  Iteration 2820/3560 Training loss: 1.3959 0.8705 sec/batch
Epoch 16/20  Iteration 2821/3560 Training loss: 1.3956 0.9126 sec/batch
Epoch 16/20  Iteration 2822/3560 Training loss: 1.3955 0.8529 sec/batch
Epoch 16/20  Iteration 2823/3560 Training loss: 1.3956 0.8635 sec/batch
Epoch 16/20  Iteration 2824/3560 Training loss: 1.3956 0.8894 sec/batch
Epoch 16/20  Iteration 2825/3560 Training loss: 1.3956 0.9088 sec/batch
Epoch 16/20  Iteration 2826/3560 Training loss: 1.3957 0.8483 sec/batch
Epoch 16/20  Iteration 2827/3560 Training loss: 1.3957 0.8957 sec/batch
Epoch 16/20  Iteration 2828/3560 Training loss: 1.3956 0.8490 sec/batch
Epoch 16/20  Iteration 2829/3560 Training loss: 1.3954 0.8645 sec/batch
Epoch 16/20  Iteration 2830/3560 Training loss: 1.3955 0.8583 sec/batch
Epoch 16/20  Iteration 2831/3560 Training loss: 1.3957 0.8470 sec/batch
Epoch 16/20  Iteration 2832/3560 Training loss: 1.3958 0.8541 sec/batch
Epoch 16/20  Iteration 2833/3560 Training loss: 1.3958 0.9251 sec/batch
Epoch 16/20  Iteration 2834/3560 Training loss: 1.3958 0.8590 sec/batch
Epoch 16/20  Iteration 2835/3560 Training loss: 1.3958 0.8584 sec/batch
Epoch 16/20  Iteration 2836/3560 Training loss: 1.3958 0.8643 sec/batch
Epoch 16/20  Iteration 2837/3560 Training loss: 1.3959 0.8524 sec/batch
Epoch 16/20  Iteration 2838/3560 Training loss: 1.3964 0.8575 sec/batch
Epoch 16/20  Iteration 2839/3560 Training loss: 1.3965 0.8594 sec/batch
Epoch 16/20  Iteration 2840/3560 Training loss: 1.3965 0.9045 sec/batch
Epoch 16/20  Iteration 2841/3560 Training loss: 1.3964 0.8462 sec/batch
Epoch 16/20  Iteration 2842/3560 Training loss: 1.3963 0.8609 sec/batch
Epoch 16/20  Iteration 2843/3560 Training loss: 1.3964 0.8680 sec/batch
Epoch 16/20  Iteration 2844/3560 Training loss: 1.3964 0.8970 sec/batch
Epoch 16/20  Iteration 2845/3560 Training loss: 1.3965 0.8689 sec/batch
Epoch 16/20  Iteration 2846/3560 Training loss: 1.3964 0.8538 sec/batch
Epoch 16/20  Iteration 2847/3560 Training loss: 1.3963 0.8571 sec/batch
Epoch 16/20  Iteration 2848/3560 Training loss: 1.3964 0.8475 sec/batch
Epoch 17/20  Iteration 2849/3560 Training loss: 1.4825 0.8554 sec/batch
Epoch 17/20  Iteration 2850/3560 Training loss: 1.4447 0.8622 sec/batch
Epoch 17/20  Iteration 2851/3560 Training loss: 1.4282 0.8917 sec/batch
Epoch 17/20  Iteration 2852/3560 Training loss: 1.4224 0.8517 sec/batch
Epoch 17/20  Iteration 2853/3560 Training loss: 1.4130 0.9286 sec/batch
Epoch 17/20  Iteration 2854/3560 Training loss: 1.4021 0.8534 sec/batch
Epoch 17/20  Iteration 2855/3560 Training loss: 1.4021 0.9006 sec/batch
Epoch 17/20  Iteration 2856/3560 Training loss: 1.3995 0.8798 sec/batch
Epoch 17/20  Iteration 2857/3560 Training loss: 1.3985 0.9076 sec/batch
Epoch 17/20  Iteration 2858/3560 Training loss: 1.3985 0.8845 sec/batch
Epoch 17/20  Iteration 2859/3560 Training loss: 1.3950 0.8645 sec/batch
Epoch 17/20  Iteration 2860/3560 Training loss: 1.3958 0.8727 sec/batch
Epoch 17/20  Iteration 2861/3560 Training loss: 1.3948 0.9229 sec/batch
Epoch 17/20  Iteration 2862/3560 Training loss: 1.3952 0.8463 sec/batch
Epoch 17/20  Iteration 2863/3560 Training loss: 1.3940 0.9152 sec/batch
Epoch 17/20  Iteration 2864/3560 Training loss: 1.3923 0.8740 sec/batch
Epoch 17/20  Iteration 2865/3560 Training loss: 1.3924 0.8971 sec/batch
Epoch 17/20  Iteration 2866/3560 Training loss: 1.3937 0.8886 sec/batch
Epoch 17/20  Iteration 2867/3560 Training loss: 1.3933 0.8627 sec/batch
Epoch 17/20  Iteration 2868/3560 Training loss: 1.3943 0.9424 sec/batch
Epoch 17/20  Iteration 2869/3560 Training loss: 1.3936 0.8506 sec/batch
Epoch 17/20  Iteration 2870/3560 Training loss: 1.3942 0.8770 sec/batch
Epoch 17/20  Iteration 2871/3560 Training loss: 1.3933 0.9103 sec/batch
Epoch 17/20  Iteration 2872/3560 Training loss: 1.3935 0.8677 sec/batch
Epoch 17/20  Iteration 2873/3560 Training loss: 1.3935 0.8560 sec/batch
Epoch 17/20  Iteration 2874/3560 Training loss: 1.3918 0.8506 sec/batch
Epoch 17/20  Iteration 2875/3560 Training loss: 1.3909 0.8982 sec/batch
Epoch 17/20  Iteration 2876/3560 Training loss: 1.3916 0.8540 sec/batch
Epoch 17/20  Iteration 2877/3560 Training loss: 1.3919 0.8583 sec/batch
Epoch 17/20  Iteration 2878/3560 Training loss: 1.3923 0.8597 sec/batch
Epoch 17/20  Iteration 2879/3560 Training loss: 1.3920 0.9068 sec/batch
Epoch 17/20  Iteration 2880/3560 Training loss: 1.3913 0.9205 sec/batch
Epoch 17/20  Iteration 2881/3560 Training loss: 1.3914 0.8741 sec/batch
Epoch 17/20  Iteration 2882/3560 Training loss: 1.3917 0.8741 sec/batch
Epoch 17/20  Iteration 2883/3560 Training loss: 1.3913 0.8508 sec/batch
Epoch 17/20  Iteration 2884/3560 Training loss: 1.3912 0.8844 sec/batch
Epoch 17/20  Iteration 2885/3560 Training loss: 1.3904 0.8785 sec/batch
Epoch 17/20  Iteration 2886/3560 Training loss: 1.3893 0.9587 sec/batch
Epoch 17/20  Iteration 2887/3560 Training loss: 1.3880 0.8824 sec/batch
Epoch 17/20  Iteration 2888/3560 Training loss: 1.3876 0.9112 sec/batch
Epoch 17/20  Iteration 2889/3560 Training loss: 1.3873 0.8831 sec/batch
Epoch 17/20  Iteration 2890/3560 Training loss: 1.3884 0.8715 sec/batch
Epoch 17/20  Iteration 2891/3560 Training loss: 1.3882 0.8502 sec/batch
Epoch 17/20  Iteration 2892/3560 Training loss: 1.3878 0.9042 sec/batch
Epoch 17/20  Iteration 2893/3560 Training loss: 1.3881 0.8919 sec/batch
Epoch 17/20  Iteration 2894/3560 Training loss: 1.3874 0.9555 sec/batch
Epoch 17/20  Iteration 2895/3560 Training loss: 1.3872 0.9177 sec/batch
Epoch 17/20  Iteration 2896/3560 Training loss: 1.3868 0.9677 sec/batch
Epoch 17/20  Iteration 2897/3560 Training loss: 1.3867 0.9077 sec/batch
Epoch 17/20  Iteration 2898/3560 Training loss: 1.3872 0.8820 sec/batch
Epoch 17/20  Iteration 2899/3560 Training loss: 1.3869 0.8505 sec/batch
Epoch 17/20  Iteration 2900/3560 Training loss: 1.3878 0.8715 sec/batch
Epoch 17/20  Iteration 2901/3560 Training loss: 1.3878 0.8500 sec/batch
Epoch 17/20  Iteration 2902/3560 Training loss: 1.3878 0.8718 sec/batch
Epoch 17/20  Iteration 2903/3560 Training loss: 1.3874 0.8430 sec/batch
Epoch 17/20  Iteration 2904/3560 Training loss: 1.3875 0.8728 sec/batch
Epoch 17/20  Iteration 2905/3560 Training loss: 1.3880 0.8589 sec/batch
Epoch 17/20  Iteration 2906/3560 Training loss: 1.3877 0.8595 sec/batch
Epoch 17/20  Iteration 2907/3560 Training loss: 1.3873 0.8472 sec/batch
Epoch 17/20  Iteration 2908/3560 Training loss: 1.3879 0.8471 sec/batch
Epoch 17/20  Iteration 2909/3560 Training loss: 1.3880 0.8429 sec/batch
Epoch 17/20  Iteration 2910/3560 Training loss: 1.3889 0.8498 sec/batch
Epoch 17/20  Iteration 2911/3560 Training loss: 1.3892 0.8586 sec/batch
Epoch 17/20  Iteration 2912/3560 Training loss: 1.3895 0.8642 sec/batch
Epoch 17/20  Iteration 2913/3560 Training loss: 1.3896 0.8455 sec/batch
Epoch 17/20  Iteration 2914/3560 Training loss: 1.3898 0.8458 sec/batch
Epoch 17/20  Iteration 2915/3560 Training loss: 1.3902 0.8506 sec/batch
Epoch 17/20  Iteration 2916/3560 Training loss: 1.3899 0.8533 sec/batch
Epoch 17/20  Iteration 2917/3560 Training loss: 1.3899 0.8454 sec/batch
Epoch 17/20  Iteration 2918/3560 Training loss: 1.3898 0.8477 sec/batch
Epoch 17/20  Iteration 2919/3560 Training loss: 1.3905 0.8984 sec/batch
Epoch 17/20  Iteration 2920/3560 Training loss: 1.3909 0.8484 sec/batch
Epoch 17/20  Iteration 2921/3560 Training loss: 1.3913 0.8682 sec/batch
Epoch 17/20  Iteration 2922/3560 Training loss: 1.3910 0.8562 sec/batch
Epoch 17/20  Iteration 2923/3560 Training loss: 1.3909 0.8484 sec/batch
Epoch 17/20  Iteration 2924/3560 Training loss: 1.3911 0.8537 sec/batch
Epoch 17/20  Iteration 2925/3560 Training loss: 1.3910 0.8559 sec/batch
Epoch 17/20  Iteration 2926/3560 Training loss: 1.3908 0.8894 sec/batch
Epoch 17/20  Iteration 2927/3560 Training loss: 1.3901 0.8987 sec/batch
Epoch 17/20  Iteration 2928/3560 Training loss: 1.3900 0.8448 sec/batch
Epoch 17/20  Iteration 2929/3560 Training loss: 1.3895 0.8434 sec/batch
Epoch 17/20  Iteration 2930/3560 Training loss: 1.3896 0.8528 sec/batch
Epoch 17/20  Iteration 2931/3560 Training loss: 1.3891 0.8651 sec/batch
Epoch 17/20  Iteration 2932/3560 Training loss: 1.3890 0.8604 sec/batch
Epoch 17/20  Iteration 2933/3560 Training loss: 1.3887 0.8703 sec/batch
Epoch 17/20  Iteration 2934/3560 Training loss: 1.3887 0.8643 sec/batch
Epoch 17/20  Iteration 2935/3560 Training loss: 1.3885 0.8548 sec/batch
Epoch 17/20  Iteration 2936/3560 Training loss: 1.3882 0.8565 sec/batch
Epoch 17/20  Iteration 2937/3560 Training loss: 1.3878 0.8449 sec/batch
Epoch 17/20  Iteration 2938/3560 Training loss: 1.3878 0.9004 sec/batch
Epoch 17/20  Iteration 2939/3560 Training loss: 1.3877 0.8528 sec/batch
Epoch 17/20  Iteration 2940/3560 Training loss: 1.3876 0.8464 sec/batch
Epoch 17/20  Iteration 2941/3560 Training loss: 1.3872 0.8534 sec/batch
Epoch 17/20  Iteration 2942/3560 Training loss: 1.3870 0.8497 sec/batch
Epoch 17/20  Iteration 2943/3560 Training loss: 1.3869 0.8453 sec/batch
Epoch 17/20  Iteration 2944/3560 Training loss: 1.3870 0.8499 sec/batch
Epoch 17/20  Iteration 2945/3560 Training loss: 1.3871 0.8480 sec/batch
Epoch 17/20  Iteration 2946/3560 Training loss: 1.3868 0.8584 sec/batch
Epoch 17/20  Iteration 2947/3560 Training loss: 1.3865 0.8607 sec/batch
Epoch 17/20  Iteration 2948/3560 Training loss: 1.3862 0.8890 sec/batch
Epoch 17/20  Iteration 2949/3560 Training loss: 1.3862 0.8502 sec/batch
Epoch 17/20  Iteration 2950/3560 Training loss: 1.3861 0.8541 sec/batch
Epoch 17/20  Iteration 2951/3560 Training loss: 1.3860 0.8584 sec/batch
Epoch 17/20  Iteration 2952/3560 Training loss: 1.3859 0.8541 sec/batch
Epoch 17/20  Iteration 2953/3560 Training loss: 1.3858 0.8471 sec/batch
Epoch 17/20  Iteration 2954/3560 Training loss: 1.3857 0.8722 sec/batch
Epoch 17/20  Iteration 2955/3560 Training loss: 1.3856 0.8487 sec/batch
Epoch 17/20  Iteration 2956/3560 Training loss: 1.3857 0.8533 sec/batch
Epoch 17/20  Iteration 2957/3560 Training loss: 1.3856 0.8997 sec/batch
Epoch 17/20  Iteration 2958/3560 Training loss: 1.3857 0.8524 sec/batch
Epoch 17/20  Iteration 2959/3560 Training loss: 1.3855 0.8524 sec/batch
Epoch 17/20  Iteration 2960/3560 Training loss: 1.3855 0.8467 sec/batch
Epoch 17/20  Iteration 2961/3560 Training loss: 1.3855 0.8437 sec/batch
Epoch 17/20  Iteration 2962/3560 Training loss: 1.3853 0.8555 sec/batch
Epoch 17/20  Iteration 2963/3560 Training loss: 1.3850 0.9178 sec/batch
Epoch 17/20  Iteration 2964/3560 Training loss: 1.3847 0.8728 sec/batch
Epoch 17/20  Iteration 2965/3560 Training loss: 1.3846 0.8558 sec/batch
Epoch 17/20  Iteration 2966/3560 Training loss: 1.3846 0.8583 sec/batch
Epoch 17/20  Iteration 2967/3560 Training loss: 1.3845 0.8576 sec/batch
Epoch 17/20  Iteration 2968/3560 Training loss: 1.3844 0.8641 sec/batch
Epoch 17/20  Iteration 2969/3560 Training loss: 1.3843 0.9086 sec/batch
Epoch 17/20  Iteration 2970/3560 Training loss: 1.3840 0.9196 sec/batch
Epoch 17/20  Iteration 2971/3560 Training loss: 1.3837 0.8527 sec/batch
Epoch 17/20  Iteration 2972/3560 Training loss: 1.3837 0.8693 sec/batch
Epoch 17/20  Iteration 2973/3560 Training loss: 1.3836 0.8616 sec/batch
Epoch 17/20  Iteration 2974/3560 Training loss: 1.3832 0.8592 sec/batch
Epoch 17/20  Iteration 2975/3560 Training loss: 1.3833 0.8852 sec/batch
Epoch 17/20  Iteration 2976/3560 Training loss: 1.3832 0.9007 sec/batch
Epoch 17/20  Iteration 2977/3560 Training loss: 1.3831 0.8684 sec/batch
Epoch 17/20  Iteration 2978/3560 Training loss: 1.3828 0.8427 sec/batch
Epoch 17/20  Iteration 2979/3560 Training loss: 1.3824 0.8929 sec/batch
Epoch 17/20  Iteration 2980/3560 Training loss: 1.3822 0.8625 sec/batch
Epoch 17/20  Iteration 2981/3560 Training loss: 1.3823 0.8830 sec/batch
Epoch 17/20  Iteration 2982/3560 Training loss: 1.3824 0.8610 sec/batch
Epoch 17/20  Iteration 2983/3560 Training loss: 1.3824 0.8544 sec/batch
Epoch 17/20  Iteration 2984/3560 Training loss: 1.3824 0.8386 sec/batch
Epoch 17/20  Iteration 2985/3560 Training loss: 1.3825 0.9029 sec/batch
Epoch 17/20  Iteration 2986/3560 Training loss: 1.3826 0.8388 sec/batch
Epoch 17/20  Iteration 2987/3560 Training loss: 1.3827 0.8937 sec/batch
Epoch 17/20  Iteration 2988/3560 Training loss: 1.3826 0.8720 sec/batch
Epoch 17/20  Iteration 2989/3560 Training loss: 1.3829 0.8567 sec/batch
Epoch 17/20  Iteration 2990/3560 Training loss: 1.3830 0.9266 sec/batch
Epoch 17/20  Iteration 2991/3560 Training loss: 1.3829 0.9052 sec/batch
Epoch 17/20  Iteration 2992/3560 Training loss: 1.3832 0.9040 sec/batch
Epoch 17/20  Iteration 2993/3560 Training loss: 1.3831 0.9106 sec/batch
Epoch 17/20  Iteration 2994/3560 Training loss: 1.3833 0.8810 sec/batch
Epoch 17/20  Iteration 2995/3560 Training loss: 1.3833 0.8901 sec/batch
Epoch 17/20  Iteration 2996/3560 Training loss: 1.3836 0.8660 sec/batch
Epoch 17/20  Iteration 2997/3560 Training loss: 1.3838 0.8637 sec/batch
Epoch 17/20  Iteration 2998/3560 Training loss: 1.3837 0.8519 sec/batch
Epoch 17/20  Iteration 2999/3560 Training loss: 1.3834 0.8405 sec/batch
Epoch 17/20  Iteration 3000/3560 Training loss: 1.3833 0.8525 sec/batch
Epoch 17/20  Iteration 3001/3560 Training loss: 1.3833 0.8490 sec/batch
Epoch 17/20  Iteration 3002/3560 Training loss: 1.3834 0.8462 sec/batch
Epoch 17/20  Iteration 3003/3560 Training loss: 1.3833 0.8472 sec/batch
Epoch 17/20  Iteration 3004/3560 Training loss: 1.3834 0.8479 sec/batch
Epoch 17/20  Iteration 3005/3560 Training loss: 1.3834 0.8545 sec/batch
Epoch 17/20  Iteration 3006/3560 Training loss: 1.3834 0.8538 sec/batch
Epoch 17/20  Iteration 3007/3560 Training loss: 1.3832 0.8444 sec/batch
Epoch 17/20  Iteration 3008/3560 Training loss: 1.3834 0.8457 sec/batch
Epoch 17/20  Iteration 3009/3560 Training loss: 1.3835 0.8516 sec/batch
Epoch 17/20  Iteration 3010/3560 Training loss: 1.3835 0.8911 sec/batch
Epoch 17/20  Iteration 3011/3560 Training loss: 1.3835 0.8523 sec/batch
Epoch 17/20  Iteration 3012/3560 Training loss: 1.3835 0.8486 sec/batch
Epoch 17/20  Iteration 3013/3560 Training loss: 1.3835 0.8482 sec/batch
Epoch 17/20  Iteration 3014/3560 Training loss: 1.3835 0.8530 sec/batch
Epoch 17/20  Iteration 3015/3560 Training loss: 1.3836 0.8457 sec/batch
Epoch 17/20  Iteration 3016/3560 Training loss: 1.3840 0.8687 sec/batch
Epoch 17/20  Iteration 3017/3560 Training loss: 1.3841 0.8541 sec/batch
Epoch 17/20  Iteration 3018/3560 Training loss: 1.3841 0.8472 sec/batch
Epoch 17/20  Iteration 3019/3560 Training loss: 1.3840 0.8515 sec/batch
Epoch 17/20  Iteration 3020/3560 Training loss: 1.3839 0.8472 sec/batch
Epoch 17/20  Iteration 3021/3560 Training loss: 1.3840 0.8478 sec/batch
Epoch 17/20  Iteration 3022/3560 Training loss: 1.3841 0.8503 sec/batch
Epoch 17/20  Iteration 3023/3560 Training loss: 1.3841 0.8520 sec/batch
Epoch 17/20  Iteration 3024/3560 Training loss: 1.3840 0.8474 sec/batch
Epoch 17/20  Iteration 3025/3560 Training loss: 1.3839 0.8569 sec/batch
Epoch 17/20  Iteration 3026/3560 Training loss: 1.3840 0.8555 sec/batch
Epoch 18/20  Iteration 3027/3560 Training loss: 1.4670 0.8486 sec/batch
Epoch 18/20  Iteration 3028/3560 Training loss: 1.4308 0.8677 sec/batch
Epoch 18/20  Iteration 3029/3560 Training loss: 1.4199 0.8861 sec/batch
Epoch 18/20  Iteration 3030/3560 Training loss: 1.4125 0.8516 sec/batch
Epoch 18/20  Iteration 3031/3560 Training loss: 1.4033 0.8999 sec/batch
Epoch 18/20  Iteration 3032/3560 Training loss: 1.3927 1.0080 sec/batch
Epoch 18/20  Iteration 3033/3560 Training loss: 1.3936 0.8918 sec/batch
Epoch 18/20  Iteration 3034/3560 Training loss: 1.3908 0.9125 sec/batch
Epoch 18/20  Iteration 3035/3560 Training loss: 1.3906 0.8639 sec/batch
Epoch 18/20  Iteration 3036/3560 Training loss: 1.3901 0.9118 sec/batch
Epoch 18/20  Iteration 3037/3560 Training loss: 1.3873 0.8698 sec/batch
Epoch 18/20  Iteration 3038/3560 Training loss: 1.3865 0.8604 sec/batch
Epoch 18/20  Iteration 3039/3560 Training loss: 1.3861 0.8671 sec/batch
Epoch 18/20  Iteration 3040/3560 Training loss: 1.3873 0.8935 sec/batch
Epoch 18/20  Iteration 3041/3560 Training loss: 1.3859 0.8612 sec/batch
Epoch 18/20  Iteration 3042/3560 Training loss: 1.3839 0.9030 sec/batch
Epoch 18/20  Iteration 3043/3560 Training loss: 1.3841 0.8842 sec/batch
Epoch 18/20  Iteration 3044/3560 Training loss: 1.3856 0.8494 sec/batch
Epoch 18/20  Iteration 3045/3560 Training loss: 1.3854 0.8675 sec/batch
Epoch 18/20  Iteration 3046/3560 Training loss: 1.3869 0.8515 sec/batch
Epoch 18/20  Iteration 3047/3560 Training loss: 1.3861 0.8915 sec/batch
Epoch 18/20  Iteration 3048/3560 Training loss: 1.3865 0.8510 sec/batch
Epoch 18/20  Iteration 3049/3560 Training loss: 1.3856 0.8439 sec/batch
Epoch 18/20  Iteration 3050/3560 Training loss: 1.3859 0.8866 sec/batch
Epoch 18/20  Iteration 3051/3560 Training loss: 1.3857 0.8779 sec/batch
Epoch 18/20  Iteration 3052/3560 Training loss: 1.3840 0.8526 sec/batch
Epoch 18/20  Iteration 3053/3560 Training loss: 1.3830 0.8499 sec/batch
Epoch 18/20  Iteration 3054/3560 Training loss: 1.3837 0.8628 sec/batch
Epoch 18/20  Iteration 3055/3560 Training loss: 1.3841 0.8653 sec/batch
Epoch 18/20  Iteration 3056/3560 Training loss: 1.3843 0.8680 sec/batch
Epoch 18/20  Iteration 3057/3560 Training loss: 1.3837 0.8507 sec/batch
Epoch 18/20  Iteration 3058/3560 Training loss: 1.3831 0.8479 sec/batch
Epoch 18/20  Iteration 3059/3560 Training loss: 1.3832 0.8555 sec/batch
Epoch 18/20  Iteration 3060/3560 Training loss: 1.3834 0.8506 sec/batch
Epoch 18/20  Iteration 3061/3560 Training loss: 1.3830 0.8531 sec/batch
Epoch 18/20  Iteration 3062/3560 Training loss: 1.3830 0.8632 sec/batch
Epoch 18/20  Iteration 3063/3560 Training loss: 1.3820 0.8502 sec/batch
Epoch 18/20  Iteration 3064/3560 Training loss: 1.3813 0.8465 sec/batch
Epoch 18/20  Iteration 3065/3560 Training loss: 1.3801 0.8829 sec/batch
Epoch 18/20  Iteration 3066/3560 Training loss: 1.3798 0.9287 sec/batch
Epoch 18/20  Iteration 3067/3560 Training loss: 1.3794 0.8750 sec/batch
Epoch 18/20  Iteration 3068/3560 Training loss: 1.3803 0.8781 sec/batch
Epoch 18/20  Iteration 3069/3560 Training loss: 1.3802 0.8625 sec/batch
Epoch 18/20  Iteration 3070/3560 Training loss: 1.3796 0.8554 sec/batch
Epoch 18/20  Iteration 3071/3560 Training loss: 1.3798 0.9236 sec/batch
Epoch 18/20  Iteration 3072/3560 Training loss: 1.3790 0.8514 sec/batch
Epoch 18/20  Iteration 3073/3560 Training loss: 1.3785 0.9007 sec/batch
Epoch 18/20  Iteration 3074/3560 Training loss: 1.3781 0.9640 sec/batch
Epoch 18/20  Iteration 3075/3560 Training loss: 1.3783 0.8576 sec/batch
Epoch 18/20  Iteration 3076/3560 Training loss: 1.3788 0.8582 sec/batch
Epoch 18/20  Iteration 3077/3560 Training loss: 1.3785 0.8731 sec/batch
Epoch 18/20  Iteration 3078/3560 Training loss: 1.3792 0.8495 sec/batch
Epoch 18/20  Iteration 3079/3560 Training loss: 1.3792 0.8565 sec/batch
Epoch 18/20  Iteration 3080/3560 Training loss: 1.3792 0.9023 sec/batch
Epoch 18/20  Iteration 3081/3560 Training loss: 1.3790 0.9024 sec/batch
Epoch 18/20  Iteration 3082/3560 Training loss: 1.3792 0.8594 sec/batch
Epoch 18/20  Iteration 3083/3560 Training loss: 1.3796 0.8404 sec/batch
Epoch 18/20  Iteration 3084/3560 Training loss: 1.3791 0.8643 sec/batch
Epoch 18/20  Iteration 3085/3560 Training loss: 1.3784 0.8404 sec/batch
Epoch 18/20  Iteration 3086/3560 Training loss: 1.3789 0.8410 sec/batch
Epoch 18/20  Iteration 3087/3560 Training loss: 1.3789 0.8419 sec/batch
Epoch 18/20  Iteration 3088/3560 Training loss: 1.3797 0.8414 sec/batch
Epoch 18/20  Iteration 3089/3560 Training loss: 1.3801 0.8374 sec/batch
Epoch 18/20  Iteration 3090/3560 Training loss: 1.3803 0.8428 sec/batch
Epoch 18/20  Iteration 3091/3560 Training loss: 1.3805 0.9028 sec/batch
Epoch 18/20  Iteration 3092/3560 Training loss: 1.3806 0.9028 sec/batch
Epoch 18/20  Iteration 3093/3560 Training loss: 1.3810 0.8430 sec/batch
Epoch 18/20  Iteration 3094/3560 Training loss: 1.3807 0.8947 sec/batch
Epoch 18/20  Iteration 3095/3560 Training loss: 1.3808 0.8406 sec/batch
Epoch 18/20  Iteration 3096/3560 Training loss: 1.3807 0.8489 sec/batch
Epoch 18/20  Iteration 3097/3560 Training loss: 1.3814 0.8474 sec/batch
Epoch 18/20  Iteration 3098/3560 Training loss: 1.3817 0.8567 sec/batch
Epoch 18/20  Iteration 3099/3560 Training loss: 1.3820 0.8449 sec/batch
Epoch 18/20  Iteration 3100/3560 Training loss: 1.3817 0.8422 sec/batch
Epoch 18/20  Iteration 3101/3560 Training loss: 1.3816 0.8486 sec/batch
Epoch 18/20  Iteration 3102/3560 Training loss: 1.3818 0.8400 sec/batch
Epoch 18/20  Iteration 3103/3560 Training loss: 1.3818 0.8460 sec/batch
Epoch 18/20  Iteration 3104/3560 Training loss: 1.3816 0.8383 sec/batch
Epoch 18/20  Iteration 3105/3560 Training loss: 1.3812 0.8522 sec/batch
Epoch 18/20  Iteration 3106/3560 Training loss: 1.3811 0.8410 sec/batch
Epoch 18/20  Iteration 3107/3560 Training loss: 1.3807 0.8450 sec/batch
Epoch 18/20  Iteration 3108/3560 Training loss: 1.3808 0.8427 sec/batch
Epoch 18/20  Iteration 3109/3560 Training loss: 1.3803 0.8344 sec/batch
Epoch 18/20  Iteration 3110/3560 Training loss: 1.3804 0.8405 sec/batch
Epoch 18/20  Iteration 3111/3560 Training loss: 1.3801 0.8442 sec/batch
Epoch 18/20  Iteration 3112/3560 Training loss: 1.3800 0.8443 sec/batch
Epoch 18/20  Iteration 3113/3560 Training loss: 1.3796 0.8406 sec/batch
Epoch 18/20  Iteration 3114/3560 Training loss: 1.3793 0.8411 sec/batch
Epoch 18/20  Iteration 3115/3560 Training loss: 1.3789 0.8406 sec/batch
Epoch 18/20  Iteration 3116/3560 Training loss: 1.3790 0.8327 sec/batch
Epoch 18/20  Iteration 3117/3560 Training loss: 1.3787 0.8414 sec/batch
Epoch 18/20  Iteration 3118/3560 Training loss: 1.3786 0.8456 sec/batch
Epoch 18/20  Iteration 3119/3560 Training loss: 1.3782 0.8952 sec/batch
Epoch 18/20  Iteration 3120/3560 Training loss: 1.3779 0.8958 sec/batch
Epoch 18/20  Iteration 3121/3560 Training loss: 1.3777 0.8789 sec/batch
Epoch 18/20  Iteration 3122/3560 Training loss: 1.3778 0.8417 sec/batch
Epoch 18/20  Iteration 3123/3560 Training loss: 1.3778 0.8741 sec/batch
Epoch 18/20  Iteration 3124/3560 Training loss: 1.3775 0.8482 sec/batch
Epoch 18/20  Iteration 3125/3560 Training loss: 1.3772 1.0582 sec/batch
Epoch 18/20  Iteration 3126/3560 Training loss: 1.3769 1.0078 sec/batch
Epoch 18/20  Iteration 3127/3560 Training loss: 1.3769 0.9593 sec/batch
Epoch 18/20  Iteration 3128/3560 Training loss: 1.3768 0.8749 sec/batch
Epoch 18/20  Iteration 3129/3560 Training loss: 1.3767 0.8438 sec/batch
Epoch 18/20  Iteration 3130/3560 Training loss: 1.3767 0.9317 sec/batch
Epoch 18/20  Iteration 3131/3560 Training loss: 1.3765 0.8457 sec/batch
Epoch 18/20  Iteration 3132/3560 Training loss: 1.3764 0.9149 sec/batch
Epoch 18/20  Iteration 3133/3560 Training loss: 1.3763 0.8441 sec/batch
Epoch 18/20  Iteration 3134/3560 Training loss: 1.3763 0.8515 sec/batch
Epoch 18/20  Iteration 3135/3560 Training loss: 1.3761 0.8790 sec/batch
Epoch 18/20  Iteration 3136/3560 Training loss: 1.3762 0.8426 sec/batch
...
Epoch 18/20  Iteration 3204/3560 Training loss: 1.3741 0.8377 sec/batch
Epoch 19/20  Iteration 3205/3560 Training loss: 1.4546 0.8488 sec/batch
...
Epoch 19/20  Iteration 3382/3560 Training loss: 1.3631 1.1796 sec/batch
Epoch 20/20  Iteration 3383/3560 Training loss: 1.4533 1.2164 sec/batch
...
Epoch 20/20  Iteration 3560/3560 Training loss: 1.3541 0.8901 sec/batch
Epoch 1/20  Iteration 1/3560 Training loss: 4.4263 0.8674 sec/batch
...
Epoch 1/20  Iteration 178/3560 Training loss: 3.0369 0.8634 sec/batch
Epoch 2/20  Iteration 179/3560 Training loss: 2.6712 0.8926 sec/batch
...
Epoch 2/20  Iteration 356/3560 Training loss: 2.3968 0.8472 sec/batch
Epoch 3/20  Iteration 357/3560 Training loss: 2.3461 0.8409 sec/batch
...
Epoch 3/20  Iteration 482/3560 Training loss: 2.2073 0.8585 sec/batch
Epoch 3/20  Iteration 483/3560 Training loss: 2.2071 0.8459 sec/batch
Epoch 3/20  Iteration 484/3560 Training loss: 2.2068 0.8497 sec/batch
Epoch 3/20  Iteration 485/3560 Training loss: 2.2064 0.8511 sec/batch
Epoch 3/20  Iteration 486/3560 Training loss: 2.2061 0.9044 sec/batch
Epoch 3/20  Iteration 487/3560 Training loss: 2.2056 0.8450 sec/batch
Epoch 3/20  Iteration 488/3560 Training loss: 2.2050 0.8460 sec/batch
Epoch 3/20  Iteration 489/3560 Training loss: 2.2047 0.8553 sec/batch
Epoch 3/20  Iteration 490/3560 Training loss: 2.2044 0.8534 sec/batch
Epoch 3/20  Iteration 491/3560 Training loss: 2.2040 0.8505 sec/batch
Epoch 3/20  Iteration 492/3560 Training loss: 2.2038 0.8482 sec/batch
Epoch 3/20  Iteration 493/3560 Training loss: 2.2035 0.8518 sec/batch
Epoch 3/20  Iteration 494/3560 Training loss: 2.2032 0.8533 sec/batch
Epoch 3/20  Iteration 495/3560 Training loss: 2.2032 0.8520 sec/batch
Epoch 3/20  Iteration 496/3560 Training loss: 2.2027 0.8514 sec/batch
Epoch 3/20  Iteration 497/3560 Training loss: 2.2026 0.8500 sec/batch
Epoch 3/20  Iteration 498/3560 Training loss: 2.2022 0.8710 sec/batch
Epoch 3/20  Iteration 499/3560 Training loss: 2.2018 0.8506 sec/batch
Epoch 3/20  Iteration 500/3560 Training loss: 2.2015 0.8515 sec/batch
Epoch 3/20  Iteration 501/3560 Training loss: 2.2011 0.8506 sec/batch
Epoch 3/20  Iteration 502/3560 Training loss: 2.2010 0.8703 sec/batch
Epoch 3/20  Iteration 503/3560 Training loss: 2.2007 0.8849 sec/batch
Epoch 3/20  Iteration 504/3560 Training loss: 2.2006 0.8738 sec/batch
Epoch 3/20  Iteration 505/3560 Training loss: 2.2002 0.8539 sec/batch
Epoch 3/20  Iteration 506/3560 Training loss: 2.1997 0.8606 sec/batch
Epoch 3/20  Iteration 507/3560 Training loss: 2.1994 0.9093 sec/batch
Epoch 3/20  Iteration 508/3560 Training loss: 2.1994 0.8668 sec/batch
Epoch 3/20  Iteration 509/3560 Training loss: 2.1991 0.9820 sec/batch
Epoch 3/20  Iteration 510/3560 Training loss: 2.1988 0.8725 sec/batch
Epoch 3/20  Iteration 511/3560 Training loss: 2.1984 0.8733 sec/batch
Epoch 3/20  Iteration 512/3560 Training loss: 2.1981 0.8481 sec/batch
Epoch 3/20  Iteration 513/3560 Training loss: 2.1977 0.8495 sec/batch
Epoch 3/20  Iteration 514/3560 Training loss: 2.1973 0.8710 sec/batch
Epoch 3/20  Iteration 515/3560 Training loss: 2.1969 0.8754 sec/batch
Epoch 3/20  Iteration 516/3560 Training loss: 2.1967 0.8544 sec/batch
Epoch 3/20  Iteration 517/3560 Training loss: 2.1965 0.8627 sec/batch
Epoch 3/20  Iteration 518/3560 Training loss: 2.1961 0.8772 sec/batch
Epoch 3/20  Iteration 519/3560 Training loss: 2.1958 0.8754 sec/batch
Epoch 3/20  Iteration 520/3560 Training loss: 2.1954 0.8613 sec/batch
Epoch 3/20  Iteration 521/3560 Training loss: 2.1952 0.8894 sec/batch
Epoch 3/20  Iteration 522/3560 Training loss: 2.1949 0.9099 sec/batch
Epoch 3/20  Iteration 523/3560 Training loss: 2.1946 0.8582 sec/batch
Epoch 3/20  Iteration 524/3560 Training loss: 2.1945 0.8557 sec/batch
Epoch 3/20  Iteration 525/3560 Training loss: 2.1942 0.8495 sec/batch
Epoch 3/20  Iteration 526/3560 Training loss: 2.1938 0.8532 sec/batch
Epoch 3/20  Iteration 527/3560 Training loss: 2.1934 0.8487 sec/batch
Epoch 3/20  Iteration 528/3560 Training loss: 2.1931 0.8802 sec/batch
Epoch 3/20  Iteration 529/3560 Training loss: 2.1928 0.8971 sec/batch
Epoch 3/20  Iteration 530/3560 Training loss: 2.1926 0.9329 sec/batch
Epoch 3/20  Iteration 531/3560 Training loss: 2.1923 0.8661 sec/batch
Epoch 3/20  Iteration 532/3560 Training loss: 2.1921 0.8419 sec/batch
Epoch 3/20  Iteration 533/3560 Training loss: 2.1917 0.8660 sec/batch
Epoch 3/20  Iteration 534/3560 Training loss: 2.1914 0.8841 sec/batch
Epoch 4/20  Iteration 535/3560 Training loss: 2.2138 0.8479 sec/batch
Epoch 4/20  Iteration 536/3560 Training loss: 2.1608 0.9057 sec/batch
Epoch 4/20  Iteration 537/3560 Training loss: 2.1465 0.8932 sec/batch
Epoch 4/20  Iteration 538/3560 Training loss: 2.1376 0.8790 sec/batch
Epoch 4/20  Iteration 539/3560 Training loss: 2.1350 0.8490 sec/batch
Epoch 4/20  Iteration 540/3560 Training loss: 2.1274 0.8602 sec/batch
Epoch 4/20  Iteration 541/3560 Training loss: 2.1276 0.8483 sec/batch
Epoch 4/20  Iteration 542/3560 Training loss: 2.1276 0.8608 sec/batch
Epoch 4/20  Iteration 543/3560 Training loss: 2.1290 0.8618 sec/batch
Epoch 4/20  Iteration 544/3560 Training loss: 2.1285 0.8607 sec/batch
Epoch 4/20  Iteration 545/3560 Training loss: 2.1262 0.8622 sec/batch
Epoch 4/20  Iteration 546/3560 Training loss: 2.1244 0.9633 sec/batch
Epoch 4/20  Iteration 547/3560 Training loss: 2.1249 0.8465 sec/batch
Epoch 4/20  Iteration 548/3560 Training loss: 2.1270 0.9645 sec/batch
Epoch 4/20  Iteration 549/3560 Training loss: 2.1255 0.8743 sec/batch
Epoch 4/20  Iteration 550/3560 Training loss: 2.1245 0.9465 sec/batch
Epoch 4/20  Iteration 551/3560 Training loss: 2.1244 0.8724 sec/batch
Epoch 4/20  Iteration 552/3560 Training loss: 2.1270 0.8584 sec/batch
Epoch 4/20  Iteration 553/3560 Training loss: 2.1274 0.8545 sec/batch
Epoch 4/20  Iteration 554/3560 Training loss: 2.1270 0.8640 sec/batch
Epoch 4/20  Iteration 555/3560 Training loss: 2.1266 0.8805 sec/batch
Epoch 4/20  Iteration 556/3560 Training loss: 2.1275 0.8664 sec/batch
Epoch 4/20  Iteration 557/3560 Training loss: 2.1270 0.8588 sec/batch
Epoch 4/20  Iteration 558/3560 Training loss: 2.1263 0.8432 sec/batch
Epoch 4/20  Iteration 559/3560 Training loss: 2.1257 0.8613 sec/batch
Epoch 4/20  Iteration 560/3560 Training loss: 2.1245 0.8628 sec/batch
Epoch 4/20  Iteration 561/3560 Training loss: 2.1236 0.9082 sec/batch
Epoch 4/20  Iteration 562/3560 Training loss: 2.1235 0.8891 sec/batch
Epoch 4/20  Iteration 563/3560 Training loss: 2.1243 0.9005 sec/batch
Epoch 4/20  Iteration 564/3560 Training loss: 2.1244 0.8947 sec/batch
Epoch 4/20  Iteration 565/3560 Training loss: 2.1241 0.9634 sec/batch
Epoch 4/20  Iteration 566/3560 Training loss: 2.1232 1.2198 sec/batch
Epoch 4/20  Iteration 567/3560 Training loss: 2.1229 1.1658 sec/batch
Epoch 4/20  Iteration 568/3560 Training loss: 2.1234 1.1343 sec/batch
Epoch 4/20  Iteration 569/3560 Training loss: 2.1230 1.1340 sec/batch
Epoch 4/20  Iteration 570/3560 Training loss: 2.1226 1.1661 sec/batch
Epoch 4/20  Iteration 571/3560 Training loss: 2.1221 1.3432 sec/batch
Epoch 4/20  Iteration 572/3560 Training loss: 2.1205 1.1749 sec/batch
Epoch 4/20  Iteration 573/3560 Training loss: 2.1194 1.1450 sec/batch
Epoch 4/20  Iteration 574/3560 Training loss: 2.1183 0.8665 sec/batch
Epoch 4/20  Iteration 575/3560 Training loss: 2.1175 0.8787 sec/batch
Epoch 4/20  Iteration 576/3560 Training loss: 2.1173 0.8896 sec/batch
Epoch 4/20  Iteration 577/3560 Training loss: 2.1167 0.8797 sec/batch
Epoch 4/20  Iteration 578/3560 Training loss: 2.1157 0.8980 sec/batch
Epoch 4/20  Iteration 579/3560 Training loss: 2.1156 0.8619 sec/batch
Epoch 4/20  Iteration 580/3560 Training loss: 2.1139 0.8613 sec/batch
Epoch 4/20  Iteration 581/3560 Training loss: 2.1138 0.8560 sec/batch
Epoch 4/20  Iteration 582/3560 Training loss: 2.1130 0.8508 sec/batch
Epoch 4/20  Iteration 583/3560 Training loss: 2.1125 0.8414 sec/batch
Epoch 4/20  Iteration 584/3560 Training loss: 2.1129 0.8413 sec/batch
Epoch 4/20  Iteration 585/3560 Training loss: 2.1121 0.8512 sec/batch
Epoch 4/20  Iteration 586/3560 Training loss: 2.1125 0.8560 sec/batch
Epoch 4/20  Iteration 587/3560 Training loss: 2.1120 0.8421 sec/batch
Epoch 4/20  Iteration 588/3560 Training loss: 2.1115 0.8542 sec/batch
Epoch 4/20  Iteration 589/3560 Training loss: 2.1111 0.8409 sec/batch
Epoch 4/20  Iteration 590/3560 Training loss: 2.1111 0.8478 sec/batch
Epoch 4/20  Iteration 591/3560 Training loss: 2.1110 0.8500 sec/batch
Epoch 4/20  Iteration 592/3560 Training loss: 2.1105 0.8717 sec/batch
Epoch 4/20  Iteration 593/3560 Training loss: 2.1100 0.8946 sec/batch
Epoch 4/20  Iteration 594/3560 Training loss: 2.1103 0.9524 sec/batch
Epoch 4/20  Iteration 595/3560 Training loss: 2.1099 0.8495 sec/batch
Epoch 4/20  Iteration 596/3560 Training loss: 2.1101 0.9288 sec/batch
Epoch 4/20  Iteration 597/3560 Training loss: 2.1104 0.8560 sec/batch
Epoch 4/20  Iteration 598/3560 Training loss: 2.1105 0.8674 sec/batch
Epoch 4/20  Iteration 599/3560 Training loss: 2.1101 0.8470 sec/batch
Epoch 4/20  Iteration 600/3560 Training loss: 2.1102 0.8810 sec/batch
Epoch 4/20  Iteration 601/3560 Training loss: 2.1100 0.8605 sec/batch
Epoch 4/20  Iteration 602/3560 Training loss: 2.1093 0.8523 sec/batch
Epoch 4/20  Iteration 603/3560 Training loss: 2.1088 0.8623 sec/batch
Epoch 4/20  Iteration 604/3560 Training loss: 2.1086 0.8544 sec/batch
Epoch 4/20  Iteration 605/3560 Training loss: 2.1089 0.8484 sec/batch
Epoch 4/20  Iteration 606/3560 Training loss: 2.1088 0.8702 sec/batch
Epoch 4/20  Iteration 607/3560 Training loss: 2.1088 0.8837 sec/batch
Epoch 4/20  Iteration 608/3560 Training loss: 2.1083 0.8983 sec/batch
Epoch 4/20  Iteration 609/3560 Training loss: 2.1080 0.8603 sec/batch
Epoch 4/20  Iteration 610/3560 Training loss: 2.1082 0.8622 sec/batch
Epoch 4/20  Iteration 611/3560 Training loss: 2.1079 0.8562 sec/batch
Epoch 4/20  Iteration 612/3560 Training loss: 2.1078 0.8753 sec/batch
Epoch 4/20  Iteration 613/3560 Training loss: 2.1071 0.8823 sec/batch
Epoch 4/20  Iteration 614/3560 Training loss: 2.1067 0.8663 sec/batch
Epoch 4/20  Iteration 615/3560 Training loss: 2.1060 0.8736 sec/batch
Epoch 4/20  Iteration 616/3560 Training loss: 2.1058 0.8504 sec/batch
Epoch 4/20  Iteration 617/3560 Training loss: 2.1051 0.8716 sec/batch
Epoch 4/20  Iteration 618/3560 Training loss: 2.1047 0.8736 sec/batch
Epoch 4/20  Iteration 619/3560 Training loss: 2.1038 0.8929 sec/batch
Epoch 4/20  Iteration 620/3560 Training loss: 2.1034 0.8608 sec/batch
Epoch 4/20  Iteration 621/3560 Training loss: 2.1031 0.8671 sec/batch
Epoch 4/20  Iteration 622/3560 Training loss: 2.1027 0.8398 sec/batch
Epoch 4/20  Iteration 623/3560 Training loss: 2.1020 0.8657 sec/batch
Epoch 4/20  Iteration 624/3560 Training loss: 2.1018 0.8938 sec/batch
Epoch 4/20  Iteration 625/3560 Training loss: 2.1013 0.8563 sec/batch
Epoch 4/20  Iteration 626/3560 Training loss: 2.1009 0.8639 sec/batch
Epoch 4/20  Iteration 627/3560 Training loss: 2.1002 0.8725 sec/batch
Epoch 4/20  Iteration 628/3560 Training loss: 2.0996 0.8633 sec/batch
Epoch 4/20  Iteration 629/3560 Training loss: 2.0990 0.8733 sec/batch
Epoch 4/20  Iteration 630/3560 Training loss: 2.0987 0.8627 sec/batch
Epoch 4/20  Iteration 631/3560 Training loss: 2.0983 0.8585 sec/batch
Epoch 4/20  Iteration 632/3560 Training loss: 2.0978 0.9366 sec/batch
Epoch 4/20  Iteration 633/3560 Training loss: 2.0972 0.8767 sec/batch
Epoch 4/20  Iteration 634/3560 Training loss: 2.0966 0.8575 sec/batch
Epoch 4/20  Iteration 635/3560 Training loss: 2.0964 0.8540 sec/batch
Epoch 4/20  Iteration 636/3560 Training loss: 2.0961 0.8547 sec/batch
Epoch 4/20  Iteration 637/3560 Training loss: 2.0955 0.8466 sec/batch
Epoch 4/20  Iteration 638/3560 Training loss: 2.0952 0.8541 sec/batch
Epoch 4/20  Iteration 639/3560 Training loss: 2.0947 0.8810 sec/batch
Epoch 4/20  Iteration 640/3560 Training loss: 2.0944 0.8811 sec/batch
Epoch 4/20  Iteration 641/3560 Training loss: 2.0940 0.9033 sec/batch
Epoch 4/20  Iteration 642/3560 Training loss: 2.0938 0.8628 sec/batch
Epoch 4/20  Iteration 643/3560 Training loss: 2.0936 0.8965 sec/batch
Epoch 4/20  Iteration 644/3560 Training loss: 2.0933 0.8646 sec/batch
Epoch 4/20  Iteration 645/3560 Training loss: 2.0931 0.9170 sec/batch
Epoch 4/20  Iteration 646/3560 Training loss: 2.0928 0.9332 sec/batch
Epoch 4/20  Iteration 647/3560 Training loss: 2.0925 0.8504 sec/batch
Epoch 4/20  Iteration 648/3560 Training loss: 2.0921 0.8977 sec/batch
Epoch 4/20  Iteration 649/3560 Training loss: 2.0916 0.8588 sec/batch
Epoch 4/20  Iteration 650/3560 Training loss: 2.0910 0.8526 sec/batch
Epoch 4/20  Iteration 651/3560 Training loss: 2.0907 0.8513 sec/batch
Epoch 4/20  Iteration 652/3560 Training loss: 2.0904 0.8598 sec/batch
Epoch 4/20  Iteration 653/3560 Training loss: 2.0902 0.8519 sec/batch
Epoch 4/20  Iteration 654/3560 Training loss: 2.0900 0.8585 sec/batch
Epoch 4/20  Iteration 655/3560 Training loss: 2.0898 0.8603 sec/batch
Epoch 4/20  Iteration 656/3560 Training loss: 2.0893 0.8551 sec/batch
Epoch 4/20  Iteration 657/3560 Training loss: 2.0888 0.8957 sec/batch
Epoch 4/20  Iteration 658/3560 Training loss: 2.0887 0.8555 sec/batch
Epoch 4/20  Iteration 659/3560 Training loss: 2.0884 0.8565 sec/batch
Epoch 4/20  Iteration 660/3560 Training loss: 2.0879 0.8476 sec/batch
Epoch 4/20  Iteration 661/3560 Training loss: 2.0878 0.8617 sec/batch
Epoch 4/20  Iteration 662/3560 Training loss: 2.0875 0.8563 sec/batch
Epoch 4/20  Iteration 663/3560 Training loss: 2.0872 0.8542 sec/batch
Epoch 4/20  Iteration 664/3560 Training loss: 2.0871 0.8511 sec/batch
Epoch 4/20  Iteration 665/3560 Training loss: 2.0866 0.8492 sec/batch
Epoch 4/20  Iteration 666/3560 Training loss: 2.0861 0.8425 sec/batch
Epoch 4/20  Iteration 667/3560 Training loss: 2.0859 0.9152 sec/batch
Epoch 4/20  Iteration 668/3560 Training loss: 2.0858 0.8605 sec/batch
Epoch 4/20  Iteration 669/3560 Training loss: 2.0855 0.8436 sec/batch
Epoch 4/20  Iteration 670/3560 Training loss: 2.0854 0.8600 sec/batch
Epoch 4/20  Iteration 671/3560 Training loss: 2.0851 0.8510 sec/batch
Epoch 4/20  Iteration 672/3560 Training loss: 2.0850 0.8539 sec/batch
Epoch 4/20  Iteration 673/3560 Training loss: 2.0850 0.8575 sec/batch
Epoch 4/20  Iteration 674/3560 Training loss: 2.0846 0.8547 sec/batch
Epoch 4/20  Iteration 675/3560 Training loss: 2.0845 0.8540 sec/batch
Epoch 4/20  Iteration 676/3560 Training loss: 2.0843 0.8622 sec/batch
Epoch 4/20  Iteration 677/3560 Training loss: 2.0840 0.9022 sec/batch
Epoch 4/20  Iteration 678/3560 Training loss: 2.0838 0.8655 sec/batch
Epoch 4/20  Iteration 679/3560 Training loss: 2.0835 0.9073 sec/batch
Epoch 4/20  Iteration 680/3560 Training loss: 2.0834 0.8911 sec/batch
Epoch 4/20  Iteration 681/3560 Training loss: 2.0831 0.8537 sec/batch
Epoch 4/20  Iteration 682/3560 Training loss: 2.0831 0.9158 sec/batch
Epoch 4/20  Iteration 683/3560 Training loss: 2.0829 0.9608 sec/batch
Epoch 4/20  Iteration 684/3560 Training loss: 2.0825 0.9019 sec/batch
Epoch 4/20  Iteration 685/3560 Training loss: 2.0822 0.8711 sec/batch
Epoch 4/20  Iteration 686/3560 Training loss: 2.0822 0.8615 sec/batch
Epoch 4/20  Iteration 687/3560 Training loss: 2.0820 0.8437 sec/batch
Epoch 4/20  Iteration 688/3560 Training loss: 2.0818 0.8506 sec/batch
Epoch 4/20  Iteration 689/3560 Training loss: 2.0816 0.8499 sec/batch
Epoch 4/20  Iteration 690/3560 Training loss: 2.0813 0.8478 sec/batch
Epoch 4/20  Iteration 691/3560 Training loss: 2.0811 0.8523 sec/batch
Epoch 4/20  Iteration 692/3560 Training loss: 2.0808 0.8452 sec/batch
Epoch 4/20  Iteration 693/3560 Training loss: 2.0804 0.8770 sec/batch
Epoch 4/20  Iteration 694/3560 Training loss: 2.0804 0.9177 sec/batch
Epoch 4/20  Iteration 695/3560 Training loss: 2.0803 0.8829 sec/batch
Epoch 4/20  Iteration 696/3560 Training loss: 2.0800 0.9874 sec/batch
Epoch 4/20  Iteration 697/3560 Training loss: 2.0798 0.9492 sec/batch
Epoch 4/20  Iteration 698/3560 Training loss: 2.0796 0.9157 sec/batch
Epoch 4/20  Iteration 699/3560 Training loss: 2.0794 1.1297 sec/batch
Epoch 4/20  Iteration 700/3560 Training loss: 2.0792 0.9620 sec/batch
Epoch 4/20  Iteration 701/3560 Training loss: 2.0790 1.0012 sec/batch
Epoch 4/20  Iteration 702/3560 Training loss: 2.0790 0.9071 sec/batch
Epoch 4/20  Iteration 703/3560 Training loss: 2.0788 0.9103 sec/batch
Epoch 4/20  Iteration 704/3560 Training loss: 2.0786 0.9074 sec/batch
Epoch 4/20  Iteration 705/3560 Training loss: 2.0782 0.8471 sec/batch
Epoch 4/20  Iteration 706/3560 Training loss: 2.0780 0.8724 sec/batch
Epoch 4/20  Iteration 707/3560 Training loss: 2.0779 0.8573 sec/batch
Epoch 4/20  Iteration 708/3560 Training loss: 2.0776 0.9193 sec/batch
Epoch 4/20  Iteration 709/3560 Training loss: 2.0775 0.8401 sec/batch
Epoch 4/20  Iteration 710/3560 Training loss: 2.0773 0.8327 sec/batch
Epoch 4/20  Iteration 711/3560 Training loss: 2.0770 0.8429 sec/batch
Epoch 4/20  Iteration 712/3560 Training loss: 2.0768 0.8928 sec/batch
Epoch 5/20  Iteration 713/3560 Training loss: 2.1075 0.8448 sec/batch
Epoch 5/20  Iteration 714/3560 Training loss: 2.0592 0.8413 sec/batch
Epoch 5/20  Iteration 715/3560 Training loss: 2.0459 0.8620 sec/batch
Epoch 5/20  Iteration 716/3560 Training loss: 2.0386 0.8420 sec/batch
Epoch 5/20  Iteration 717/3560 Training loss: 2.0352 0.8330 sec/batch
Epoch 5/20  Iteration 718/3560 Training loss: 2.0276 0.8389 sec/batch
Epoch 5/20  Iteration 719/3560 Training loss: 2.0279 0.8398 sec/batch
Epoch 5/20  Iteration 720/3560 Training loss: 2.0281 0.8358 sec/batch
Epoch 5/20  Iteration 721/3560 Training loss: 2.0307 0.8327 sec/batch
Epoch 5/20  Iteration 722/3560 Training loss: 2.0290 0.8758 sec/batch
Epoch 5/20  Iteration 723/3560 Training loss: 2.0264 0.8465 sec/batch
Epoch 5/20  Iteration 724/3560 Training loss: 2.0247 0.9071 sec/batch
Epoch 5/20  Iteration 725/3560 Training loss: 2.0249 0.8664 sec/batch
Epoch 5/20  Iteration 726/3560 Training loss: 2.0273 0.8699 sec/batch
Epoch 5/20  Iteration 727/3560 Training loss: 2.0268 0.9068 sec/batch
Epoch 5/20  Iteration 728/3560 Training loss: 2.0251 0.8542 sec/batch
Epoch 5/20  Iteration 729/3560 Training loss: 2.0244 0.9451 sec/batch
Epoch 5/20  Iteration 730/3560 Training loss: 2.0272 0.8437 sec/batch
Epoch 5/20  Iteration 731/3560 Training loss: 2.0274 0.8625 sec/batch
Epoch 5/20  Iteration 732/3560 Training loss: 2.0273 0.8452 sec/batch
Epoch 5/20  Iteration 733/3560 Training loss: 2.0264 0.8661 sec/batch
Epoch 5/20  Iteration 734/3560 Training loss: 2.0271 0.8477 sec/batch
Epoch 5/20  Iteration 735/3560 Training loss: 2.0262 0.8834 sec/batch
Epoch 5/20  Iteration 736/3560 Training loss: 2.0255 0.8422 sec/batch
Epoch 5/20  Iteration 737/3560 Training loss: 2.0253 0.8551 sec/batch
Epoch 5/20  Iteration 738/3560 Training loss: 2.0241 0.8550 sec/batch
Epoch 5/20  Iteration 739/3560 Training loss: 2.0231 0.8487 sec/batch
Epoch 5/20  Iteration 740/3560 Training loss: 2.0230 0.8420 sec/batch
Epoch 5/20  Iteration 741/3560 Training loss: 2.0237 0.8580 sec/batch
Epoch 5/20  Iteration 742/3560 Training loss: 2.0237 0.8401 sec/batch
Epoch 5/20  Iteration 743/3560 Training loss: 2.0233 0.8465 sec/batch
Epoch 5/20  Iteration 744/3560 Training loss: 2.0226 0.8385 sec/batch
Epoch 5/20  Iteration 745/3560 Training loss: 2.0221 0.8476 sec/batch
Epoch 5/20  Iteration 746/3560 Training loss: 2.0227 0.8401 sec/batch
Epoch 5/20  Iteration 747/3560 Training loss: 2.0224 0.8412 sec/batch
Epoch 5/20  Iteration 748/3560 Training loss: 2.0222 0.8840 sec/batch
Epoch 5/20  Iteration 749/3560 Training loss: 2.0218 0.8449 sec/batch
Epoch 5/20  Iteration 750/3560 Training loss: 2.0205 0.8428 sec/batch
Epoch 5/20  Iteration 751/3560 Training loss: 2.0193 0.8413 sec/batch
Epoch 5/20  Iteration 752/3560 Training loss: 2.0183 0.8366 sec/batch
Epoch 5/20  Iteration 753/3560 Training loss: 2.0179 0.8376 sec/batch
Epoch 5/20  Iteration 754/3560 Training loss: 2.0178 0.8424 sec/batch
Epoch 5/20  Iteration 755/3560 Training loss: 2.0172 0.8384 sec/batch
Epoch 5/20  Iteration 756/3560 Training loss: 2.0162 0.8442 sec/batch
Epoch 5/20  Iteration 757/3560 Training loss: 2.0162 0.8351 sec/batch
Epoch 5/20  Iteration 758/3560 Training loss: 2.0147 0.8442 sec/batch
Epoch 5/20  Iteration 759/3560 Training loss: 2.0146 0.8369 sec/batch
Epoch 5/20  Iteration 760/3560 Training loss: 2.0138 0.8865 sec/batch
Epoch 5/20  Iteration 761/3560 Training loss: 2.0136 0.8392 sec/batch
Epoch 5/20  Iteration 762/3560 Training loss: 2.0142 0.8425 sec/batch
Epoch 5/20  Iteration 763/3560 Training loss: 2.0136 0.8469 sec/batch
Epoch 5/20  Iteration 764/3560 Training loss: 2.0140 0.8447 sec/batch
Epoch 5/20  Iteration 765/3560 Training loss: 2.0136 0.8465 sec/batch
Epoch 5/20  Iteration 766/3560 Training loss: 2.0132 0.8398 sec/batch
Epoch 5/20  Iteration 767/3560 Training loss: 2.0128 0.8365 sec/batch
Epoch 5/20  Iteration 768/3560 Training loss: 2.0130 0.8435 sec/batch
Epoch 5/20  Iteration 769/3560 Training loss: 2.0131 0.8934 sec/batch
Epoch 5/20  Iteration 770/3560 Training loss: 2.0127 0.8524 sec/batch
Epoch 5/20  Iteration 771/3560 Training loss: 2.0122 0.8636 sec/batch
Epoch 5/20  Iteration 772/3560 Training loss: 2.0125 0.8905 sec/batch
Epoch 5/20  Iteration 773/3560 Training loss: 2.0123 0.8572 sec/batch
Epoch 5/20  Iteration 774/3560 Training loss: 2.0128 0.8937 sec/batch
Epoch 5/20  Iteration 775/3560 Training loss: 2.0130 0.8721 sec/batch
Epoch 5/20  Iteration 776/3560 Training loss: 2.0131 0.8378 sec/batch
Epoch 5/20  Iteration 777/3560 Training loss: 2.0127 0.8444 sec/batch
Epoch 5/20  Iteration 778/3560 Training loss: 2.0127 0.8589 sec/batch
Epoch 5/20  Iteration 779/3560 Training loss: 2.0126 0.8583 sec/batch
Epoch 5/20  Iteration 780/3560 Training loss: 2.0121 0.9050 sec/batch
Epoch 5/20  Iteration 781/3560 Training loss: 2.0117 0.9284 sec/batch
Epoch 5/20  Iteration 782/3560 Training loss: 2.0115 0.8502 sec/batch
Epoch 5/20  Iteration 783/3560 Training loss: 2.0118 0.8591 sec/batch
Epoch 5/20  Iteration 784/3560 Training loss: 2.0117 0.8450 sec/batch
Epoch 5/20  Iteration 785/3560 Training loss: 2.0117 0.8430 sec/batch
Epoch 5/20  Iteration 786/3560 Training loss: 2.0112 0.8408 sec/batch
Epoch 5/20  Iteration 787/3560 Training loss: 2.0109 0.8334 sec/batch
Epoch 5/20  Iteration 788/3560 Training loss: 2.0111 0.8431 sec/batch
Epoch 5/20  Iteration 789/3560 Training loss: 2.0108 0.8552 sec/batch
Epoch 5/20  Iteration 790/3560 Training loss: 2.0108 0.8896 sec/batch
Epoch 5/20  Iteration 791/3560 Training loss: 2.0101 0.8829 sec/batch
Epoch 5/20  Iteration 792/3560 Training loss: 2.0096 0.8435 sec/batch
Epoch 5/20  Iteration 793/3560 Training loss: 2.0089 0.8506 sec/batch
Epoch 5/20  Iteration 794/3560 Training loss: 2.0088 0.8430 sec/batch
Epoch 5/20  Iteration 795/3560 Training loss: 2.0080 0.8837 sec/batch
Epoch 5/20  Iteration 796/3560 Training loss: 2.0077 0.8570 sec/batch
Epoch 5/20  Iteration 797/3560 Training loss: 2.0070 0.8379 sec/batch
Epoch 5/20  Iteration 798/3560 Training loss: 2.0065 0.8811 sec/batch
Epoch 5/20  Iteration 799/3560 Training loss: 2.0063 0.9006 sec/batch
Epoch 5/20  Iteration 800/3560 Training loss: 2.0059 0.8504 sec/batch
Epoch 5/20  Iteration 801/3560 Training loss: 2.0052 0.8622 sec/batch
Epoch 5/20  Iteration 802/3560 Training loss: 2.0051 0.8545 sec/batch
Epoch 5/20  Iteration 803/3560 Training loss: 2.0047 0.9069 sec/batch
Epoch 5/20  Iteration 804/3560 Training loss: 2.0044 0.9233 sec/batch
Epoch 5/20  Iteration 805/3560 Training loss: 2.0037 0.8385 sec/batch
Epoch 5/20  Iteration 806/3560 Training loss: 2.0033 0.8793 sec/batch
Epoch 5/20  Iteration 807/3560 Training loss: 2.0026 0.8453 sec/batch
Epoch 5/20  Iteration 808/3560 Training loss: 2.0023 0.8413 sec/batch
Epoch 5/20  Iteration 809/3560 Training loss: 2.0021 0.8440 sec/batch
Epoch 5/20  Iteration 810/3560 Training loss: 2.0016 0.8537 sec/batch
Epoch 5/20  Iteration 811/3560 Training loss: 2.0010 0.9074 sec/batch
Epoch 5/20  Iteration 812/3560 Training loss: 2.0003 0.8817 sec/batch
Epoch 5/20  Iteration 813/3560 Training loss: 2.0001 0.8858 sec/batch
Epoch 5/20  Iteration 814/3560 Training loss: 1.9998 0.8402 sec/batch
Epoch 5/20  Iteration 815/3560 Training loss: 1.9993 0.8392 sec/batch
Epoch 5/20  Iteration 816/3560 Training loss: 1.9989 0.8364 sec/batch
Epoch 5/20  Iteration 817/3560 Training loss: 1.9986 0.8455 sec/batch
Epoch 5/20  Iteration 818/3560 Training loss: 1.9982 0.8384 sec/batch
Epoch 5/20  Iteration 819/3560 Training loss: 1.9980 0.8385 sec/batch
Epoch 5/20  Iteration 820/3560 Training loss: 1.9978 0.8465 sec/batch
Epoch 5/20  Iteration 821/3560 Training loss: 1.9977 0.8771 sec/batch
Epoch 5/20  Iteration 822/3560 Training loss: 1.9975 0.9325 sec/batch
Epoch 5/20  Iteration 823/3560 Training loss: 1.9973 0.8766 sec/batch
Epoch 5/20  Iteration 824/3560 Training loss: 1.9970 0.8393 sec/batch
Epoch 5/20  Iteration 825/3560 Training loss: 1.9967 0.8415 sec/batch
Epoch 5/20  Iteration 826/3560 Training loss: 1.9964 0.8488 sec/batch
Epoch 5/20  Iteration 827/3560 Training loss: 1.9960 0.8475 sec/batch
Epoch 5/20  Iteration 828/3560 Training loss: 1.9954 0.9043 sec/batch
Epoch 5/20  Iteration 829/3560 Training loss: 1.9952 1.2531 sec/batch
Epoch 5/20  Iteration 830/3560 Training loss: 1.9949 1.2187 sec/batch
Epoch 5/20  Iteration 831/3560 Training loss: 1.9948 1.2165 sec/batch
Epoch 5/20  Iteration 832/3560 Training loss: 1.9946 1.1635 sec/batch
Epoch 5/20  Iteration 833/3560 Training loss: 1.9945 1.1582 sec/batch
Epoch 5/20  Iteration 834/3560 Training loss: 1.9941 1.2049 sec/batch
Epoch 5/20  Iteration 835/3560 Training loss: 1.9937 1.1415 sec/batch
Epoch 5/20  Iteration 836/3560 Training loss: 1.9936 0.8424 sec/batch
Epoch 5/20  Iteration 837/3560 Training loss: 1.9934 0.8572 sec/batch
Epoch 5/20  Iteration 838/3560 Training loss: 1.9929 0.8414 sec/batch
Epoch 5/20  Iteration 839/3560 Training loss: 1.9929 0.8747 sec/batch
Epoch 5/20  Iteration 840/3560 Training loss: 1.9927 0.8300 sec/batch
Epoch 5/20  Iteration 841/3560 Training loss: 1.9925 0.8242 sec/batch
Epoch 5/20  Iteration 842/3560 Training loss: 1.9923 0.8343 sec/batch
Epoch 5/20  Iteration 843/3560 Training loss: 1.9919 0.8344 sec/batch
Epoch 5/20  Iteration 844/3560 Training loss: 1.9915 0.8374 sec/batch
Epoch 5/20  Iteration 845/3560 Training loss: 1.9913 0.8335 sec/batch
Epoch 5/20  Iteration 846/3560 Training loss: 1.9912 0.8784 sec/batch
Epoch 5/20  Iteration 847/3560 Training loss: 1.9910 0.8485 sec/batch
Epoch 5/20  Iteration 848/3560 Training loss: 1.9909 0.8848 sec/batch
Epoch 5/20  Iteration 849/3560 Training loss: 1.9909 0.8602 sec/batch
Epoch 5/20  Iteration 850/3560 Training loss: 1.9908 0.8514 sec/batch
Epoch 5/20  Iteration 851/3560 Training loss: 1.9908 0.8426 sec/batch
Epoch 5/20  Iteration 852/3560 Training loss: 1.9905 0.8929 sec/batch
Epoch 5/20  Iteration 853/3560 Training loss: 1.9905 0.8783 sec/batch
Epoch 5/20  Iteration 854/3560 Training loss: 1.9903 0.9495 sec/batch
Epoch 5/20  Iteration 855/3560 Training loss: 1.9901 0.8929 sec/batch
Epoch 5/20  Iteration 856/3560 Training loss: 1.9900 0.8828 sec/batch
Epoch 5/20  Iteration 857/3560 Training loss: 1.9897 0.8702 sec/batch
Epoch 5/20  Iteration 858/3560 Training loss: 1.9896 0.9213 sec/batch
Epoch 5/20  Iteration 859/3560 Training loss: 1.9894 0.8762 sec/batch
Epoch 5/20  Iteration 860/3560 Training loss: 1.9894 0.8408 sec/batch
Epoch 5/20  Iteration 861/3560 Training loss: 1.9892 0.8472 sec/batch
Epoch 5/20  Iteration 862/3560 Training loss: 1.9890 0.8450 sec/batch
Epoch 5/20  Iteration 863/3560 Training loss: 1.9886 0.8430 sec/batch
Epoch 5/20  Iteration 864/3560 Training loss: 1.9886 0.8423 sec/batch
Epoch 5/20  Iteration 865/3560 Training loss: 1.9885 0.8492 sec/batch
Epoch 5/20  Iteration 866/3560 Training loss: 1.9884 0.8407 sec/batch
Epoch 5/20  Iteration 867/3560 Training loss: 1.9882 0.8648 sec/batch
Epoch 5/20  Iteration 868/3560 Training loss: 1.9880 0.8484 sec/batch
Epoch 5/20  Iteration 869/3560 Training loss: 1.9878 0.8700 sec/batch
Epoch 5/20  Iteration 870/3560 Training loss: 1.9876 0.8544 sec/batch
Epoch 5/20  Iteration 871/3560 Training loss: 1.9872 0.8553 sec/batch
Epoch 5/20  Iteration 872/3560 Training loss: 1.9873 0.8962 sec/batch
Epoch 5/20  Iteration 873/3560 Training loss: 1.9873 0.8459 sec/batch
Epoch 5/20  Iteration 874/3560 Training loss: 1.9871 0.8601 sec/batch
Epoch 5/20  Iteration 875/3560 Training loss: 1.9870 0.8598 sec/batch
Epoch 5/20  Iteration 876/3560 Training loss: 1.9869 0.8450 sec/batch
Epoch 5/20  Iteration 877/3560 Training loss: 1.9868 0.8721 sec/batch
Epoch 5/20  Iteration 878/3560 Training loss: 1.9865 0.8734 sec/batch
Epoch 5/20  Iteration 879/3560 Training loss: 1.9864 0.8548 sec/batch
Epoch 5/20  Iteration 880/3560 Training loss: 1.9866 0.8675 sec/batch
Epoch 5/20  Iteration 881/3560 Training loss: 1.9864 0.8834 sec/batch
Epoch 5/20  Iteration 882/3560 Training loss: 1.9862 0.9625 sec/batch
Epoch 5/20  Iteration 883/3560 Training loss: 1.9859 0.9079 sec/batch
Epoch 5/20  Iteration 884/3560 Training loss: 1.9856 0.8916 sec/batch
Epoch 5/20  Iteration 885/3560 Training loss: 1.9855 0.9178 sec/batch
Epoch 5/20  Iteration 886/3560 Training loss: 1.9853 1.0931 sec/batch
Epoch 5/20  Iteration 887/3560 Training loss: 1.9852 1.1914 sec/batch
Epoch 5/20  Iteration 888/3560 Training loss: 1.9850 1.3324 sec/batch
Epoch 5/20  Iteration 889/3560 Training loss: 1.9848 1.2400 sec/batch
Epoch 5/20  Iteration 890/3560 Training loss: 1.9846 1.1588 sec/batch
Epoch 6/20  Iteration 891/3560 Training loss: 2.0231 1.1139 sec/batch
Epoch 6/20  Iteration 892/3560 Training loss: 1.9776 1.1337 sec/batch
Epoch 6/20  Iteration 893/3560 Training loss: 1.9671 1.1145 sec/batch
Epoch 6/20  Iteration 894/3560 Training loss: 1.9601 1.0866 sec/batch
Epoch 6/20  Iteration 895/3560 Training loss: 1.9566 1.0795 sec/batch
Epoch 6/20  Iteration 896/3560 Training loss: 1.9463 1.0687 sec/batch
Epoch 6/20  Iteration 897/3560 Training loss: 1.9467 1.1671 sec/batch
Epoch 6/20  Iteration 898/3560 Training loss: 1.9466 1.2411 sec/batch
Epoch 6/20  Iteration 899/3560 Training loss: 1.9488 1.1934 sec/batch
Epoch 6/20  Iteration 900/3560 Training loss: 1.9481 0.9773 sec/batch
Epoch 6/20  Iteration 901/3560 Training loss: 1.9448 0.8530 sec/batch
Epoch 6/20  Iteration 902/3560 Training loss: 1.9427 0.8953 sec/batch
Epoch 6/20  Iteration 903/3560 Training loss: 1.9435 0.8717 sec/batch
Epoch 6/20  Iteration 904/3560 Training loss: 1.9454 0.8589 sec/batch
Epoch 6/20  Iteration 905/3560 Training loss: 1.9447 0.8712 sec/batch
Epoch 6/20  Iteration 906/3560 Training loss: 1.9430 1.0581 sec/batch
Epoch 6/20  Iteration 907/3560 Training loss: 1.9430 0.9355 sec/batch
Epoch 6/20  Iteration 908/3560 Training loss: 1.9451 0.8498 sec/batch
Epoch 6/20  Iteration 909/3560 Training loss: 1.9451 0.8377 sec/batch
Epoch 6/20  Iteration 910/3560 Training loss: 1.9452 0.8755 sec/batch
Epoch 6/20  Iteration 911/3560 Training loss: 1.9441 0.8929 sec/batch
Epoch 6/20  Iteration 912/3560 Training loss: 1.9450 0.9034 sec/batch
Epoch 6/20  Iteration 913/3560 Training loss: 1.9443 0.8790 sec/batch
Epoch 6/20  Iteration 914/3560 Training loss: 1.9440 0.8595 sec/batch
Epoch 6/20  Iteration 915/3560 Training loss: 1.9439 0.8559 sec/batch
Epoch 6/20  Iteration 916/3560 Training loss: 1.9425 0.8396 sec/batch
Epoch 6/20  Iteration 917/3560 Training loss: 1.9412 0.8363 sec/batch
Epoch 6/20  Iteration 918/3560 Training loss: 1.9413 0.8396 sec/batch
Epoch 6/20  Iteration 919/3560 Training loss: 1.9419 0.8409 sec/batch
Epoch 6/20  Iteration 920/3560 Training loss: 1.9420 0.8369 sec/batch
Epoch 6/20  Iteration 921/3560 Training loss: 1.9415 0.8393 sec/batch
Epoch 6/20  Iteration 922/3560 Training loss: 1.9404 0.8793 sec/batch
Epoch 6/20  Iteration 923/3560 Training loss: 1.9404 0.8336 sec/batch
Epoch 6/20  Iteration 924/3560 Training loss: 1.9412 0.8625 sec/batch
Epoch 6/20  Iteration 925/3560 Training loss: 1.9407 0.8327 sec/batch
Epoch 6/20  Iteration 926/3560 Training loss: 1.9399 0.8362 sec/batch
Epoch 6/20  Iteration 927/3560 Training loss: 1.9393 0.8254 sec/batch
Epoch 6/20  Iteration 928/3560 Training loss: 1.9380 0.8337 sec/batch
Epoch 6/20  Iteration 929/3560 Training loss: 1.9369 0.8378 sec/batch
Epoch 6/20  Iteration 930/3560 Training loss: 1.9360 0.8336 sec/batch
Epoch 6/20  Iteration 931/3560 Training loss: 1.9355 0.8335 sec/batch
Epoch 6/20  Iteration 932/3560 Training loss: 1.9358 0.8343 sec/batch
Epoch 6/20  Iteration 933/3560 Training loss: 1.9352 0.8319 sec/batch
Epoch 6/20  Iteration 934/3560 Training loss: 1.9343 0.8339 sec/batch
Epoch 6/20  Iteration 935/3560 Training loss: 1.9345 0.8580 sec/batch
Epoch 6/20  Iteration 936/3560 Training loss: 1.9332 0.8413 sec/batch
Epoch 6/20  Iteration 937/3560 Training loss: 1.9332 0.8291 sec/batch
Epoch 6/20  Iteration 938/3560 Training loss: 1.9326 0.8809 sec/batch
Epoch 6/20  Iteration 939/3560 Training loss: 1.9324 0.8452 sec/batch
Epoch 6/20  Iteration 940/3560 Training loss: 1.9330 0.8339 sec/batch
Epoch 6/20  Iteration 941/3560 Training loss: 1.9323 0.8259 sec/batch
Epoch 6/20  Iteration 942/3560 Training loss: 1.9331 0.8344 sec/batch
Epoch 6/20  Iteration 943/3560 Training loss: 1.9329 0.8337 sec/batch
Epoch 6/20  Iteration 944/3560 Training loss: 1.9327 0.8295 sec/batch
Epoch 6/20  Iteration 945/3560 Training loss: 1.9322 0.8650 sec/batch
Epoch 6/20  Iteration 946/3560 Training loss: 1.9324 0.8351 sec/batch
Epoch 6/20  Iteration 947/3560 Training loss: 1.9324 0.8335 sec/batch
Epoch 6/20  Iteration 948/3560 Training loss: 1.9319 0.8385 sec/batch
Epoch 6/20  Iteration 949/3560 Training loss: 1.9313 0.8363 sec/batch
Epoch 6/20  Iteration 950/3560 Training loss: 1.9320 0.8350 sec/batch
Epoch 6/20  Iteration 951/3560 Training loss: 1.9317 0.8325 sec/batch
Epoch 6/20  Iteration 952/3560 Training loss: 1.9322 0.9000 sec/batch
Epoch 6/20  Iteration 953/3560 Training loss: 1.9324 0.9108 sec/batch
Epoch 6/20  Iteration 954/3560 Training loss: 1.9325 0.8750 sec/batch
Epoch 6/20  Iteration 955/3560 Training loss: 1.9320 0.8383 sec/batch
Epoch 6/20  Iteration 956/3560 Training loss: 1.9321 0.8271 sec/batch
Epoch 6/20  Iteration 957/3560 Training loss: 1.9322 0.8382 sec/batch
Epoch 6/20  Iteration 958/3560 Training loss: 1.9316 0.8324 sec/batch
Epoch 6/20  Iteration 959/3560 Training loss: 1.9313 0.8374 sec/batch
Epoch 6/20  Iteration 960/3560 Training loss: 1.9311 0.8540 sec/batch
Epoch 6/20  Iteration 961/3560 Training loss: 1.9315 0.8982 sec/batch
Epoch 6/20  Iteration 962/3560 Training loss: 1.9313 0.8478 sec/batch
Epoch 6/20  Iteration 963/3560 Training loss: 1.9316 0.9154 sec/batch
Epoch 6/20  Iteration 964/3560 Training loss: 1.9311 0.8602 sec/batch
Epoch 6/20  Iteration 965/3560 Training loss: 1.9308 0.8420 sec/batch
Epoch 6/20  Iteration 966/3560 Training loss: 1.9310 0.8588 sec/batch
Epoch 6/20  Iteration 967/3560 Training loss: 1.9308 0.8479 sec/batch
Epoch 6/20  Iteration 968/3560 Training loss: 1.9308 0.8429 sec/batch
Epoch 6/20  Iteration 969/3560 Training loss: 1.9303 0.8728 sec/batch
Epoch 6/20  Iteration 970/3560 Training loss: 1.9300 0.8544 sec/batch
Epoch 6/20  Iteration 971/3560 Training loss: 1.9294 0.8490 sec/batch
Epoch 6/20  Iteration 972/3560 Training loss: 1.9294 0.8425 sec/batch
Epoch 6/20  Iteration 973/3560 Training loss: 1.9287 0.8415 sec/batch
Epoch 6/20  Iteration 974/3560 Training loss: 1.9285 0.8444 sec/batch
Epoch 6/20  Iteration 975/3560 Training loss: 1.9277 0.8749 sec/batch
Epoch 6/20  Iteration 976/3560 Training loss: 1.9273 0.8672 sec/batch
Epoch 6/20  Iteration 977/3560 Training loss: 1.9270 0.8518 sec/batch
Epoch 6/20  Iteration 978/3560 Training loss: 1.9266 0.8701 sec/batch
Epoch 6/20  Iteration 979/3560 Training loss: 1.9260 1.0128 sec/batch
Epoch 6/20  Iteration 980/3560 Training loss: 1.9260 1.1983 sec/batch
Epoch 6/20  Iteration 981/3560 Training loss: 1.9256 1.1716 sec/batch
Epoch 6/20  Iteration 982/3560 Training loss: 1.9254 1.2417 sec/batch
Epoch 6/20  Iteration 983/3560 Training loss: 1.9248 1.2182 sec/batch
Epoch 6/20  Iteration 984/3560 Training loss: 1.9243 0.8484 sec/batch
Epoch 6/20  Iteration 985/3560 Training loss: 1.9238 0.8540 sec/batch
Epoch 6/20  Iteration 986/3560 Training loss: 1.9236 0.8641 sec/batch
Epoch 6/20  Iteration 987/3560 Training loss: 1.9234 0.8374 sec/batch
Epoch 6/20  Iteration 988/3560 Training loss: 1.9229 0.8852 sec/batch
Epoch 6/20  Iteration 989/3560 Training loss: 1.9224 0.8507 sec/batch
Epoch 6/20  Iteration 990/3560 Training loss: 1.9217 0.8606 sec/batch
Epoch 6/20  Iteration 991/3560 Training loss: 1.9216 0.8750 sec/batch
Epoch 6/20  Iteration 992/3560 Training loss: 1.9215 0.8463 sec/batch
Epoch 6/20  Iteration 993/3560 Training loss: 1.9210 0.8539 sec/batch
Epoch 6/20  Iteration 994/3560 Training loss: 1.9208 0.8720 sec/batch
Epoch 6/20  Iteration 995/3560 Training loss: 1.9205 0.8931 sec/batch
Epoch 6/20  Iteration 996/3560 Training loss: 1.9202 0.8507 sec/batch
Epoch 6/20  Iteration 997/3560 Training loss: 1.9199 0.9122 sec/batch
Epoch 6/20  Iteration 998/3560 Training loss: 1.9198 0.8509 sec/batch
Epoch 6/20  Iteration 999/3560 Training loss: 1.9197 0.8620 sec/batch
Epoch 6/20  Iteration 1000/3560 Training loss: 1.9196 0.8360 sec/batch
Epoch 6/20  Iteration 1001/3560 Training loss: 1.9194 0.8329 sec/batch
Epoch 6/20  Iteration 1002/3560 Training loss: 1.9190 0.8369 sec/batch
Epoch 6/20  Iteration 1003/3560 Training loss: 1.9188 0.8501 sec/batch
Epoch 6/20  Iteration 1004/3560 Training loss: 1.9185 0.8395 sec/batch
Epoch 6/20  Iteration 1005/3560 Training loss: 1.9182 0.8775 sec/batch
Epoch 6/20  Iteration 1006/3560 Training loss: 1.9176 0.8311 sec/batch
Epoch 6/20  Iteration 1007/3560 Training loss: 1.9173 0.8559 sec/batch
Epoch 6/20  Iteration 1008/3560 Training loss: 1.9171 0.8519 sec/batch
Epoch 6/20  Iteration 1009/3560 Training loss: 1.9169 0.8388 sec/batch
Epoch 6/20  Iteration 1010/3560 Training loss: 1.9167 0.8344 sec/batch
Epoch 6/20  Iteration 1011/3560 Training loss: 1.9167 0.8357 sec/batch
Epoch 6/20  Iteration 1012/3560 Training loss: 1.9163 0.8363 sec/batch
Epoch 6/20  Iteration 1013/3560 Training loss: 1.9159 0.8401 sec/batch
Epoch 6/20  Iteration 1014/3560 Training loss: 1.9158 0.8369 sec/batch
Epoch 6/20  Iteration 1015/3560 Training loss: 1.9156 0.8351 sec/batch
Epoch 6/20  Iteration 1016/3560 Training loss: 1.9152 0.8362 sec/batch
Epoch 6/20  Iteration 1017/3560 Training loss: 1.9151 0.8315 sec/batch
Epoch 6/20  Iteration 1018/3560 Training loss: 1.9150 0.8426 sec/batch
Epoch 6/20  Iteration 1019/3560 Training loss: 1.9149 0.8304 sec/batch
Epoch 6/20  Iteration 1020/3560 Training loss: 1.9147 0.8423 sec/batch
Epoch 6/20  Iteration 1021/3560 Training loss: 1.9143 0.8299 sec/batch
Epoch 6/20  Iteration 1022/3560 Training loss: 1.9139 0.8393 sec/batch
Epoch 6/20  Iteration 1023/3560 Training loss: 1.9138 0.8285 sec/batch
Epoch 6/20  Iteration 1024/3560 Training loss: 1.9136 0.8363 sec/batch
Epoch 6/20  Iteration 1025/3560 Training loss: 1.9135 0.8383 sec/batch
Epoch 6/20  Iteration 1026/3560 Training loss: 1.9134 0.8423 sec/batch
Epoch 6/20  Iteration 1027/3560 Training loss: 1.9134 0.9416 sec/batch
Epoch 6/20  Iteration 1028/3560 Training loss: 1.9133 0.8544 sec/batch
Epoch 6/20  Iteration 1029/3560 Training loss: 1.9134 0.8559 sec/batch
Epoch 6/20  Iteration 1030/3560 Training loss: 1.9132 0.8306 sec/batch
Epoch 6/20  Iteration 1031/3560 Training loss: 1.9132 0.8342 sec/batch
Epoch 6/20  Iteration 1032/3560 Training loss: 1.9131 0.8323 sec/batch
Epoch 6/20  Iteration 1033/3560 Training loss: 1.9129 0.8295 sec/batch
Epoch 6/20  Iteration 1034/3560 Training loss: 1.9128 0.8291 sec/batch
Epoch 6/20  Iteration 1035/3560 Training loss: 1.9125 0.8739 sec/batch
Epoch 6/20  Iteration 1036/3560 Training loss: 1.9125 0.8549 sec/batch
Epoch 6/20  Iteration 1037/3560 Training loss: 1.9124 0.8371 sec/batch
Epoch 6/20  Iteration 1038/3560 Training loss: 1.9125 0.8498 sec/batch
Epoch 6/20  Iteration 1039/3560 Training loss: 1.9123 0.8385 sec/batch
Epoch 6/20  Iteration 1040/3560 Training loss: 1.9120 0.8401 sec/batch
Epoch 6/20  Iteration 1041/3560 Training loss: 1.9117 0.8548 sec/batch
Epoch 6/20  Iteration 1042/3560 Training loss: 1.9118 0.9516 sec/batch
Epoch 6/20  Iteration 1043/3560 Training loss: 1.9117 0.9351 sec/batch
Epoch 6/20  Iteration 1044/3560 Training loss: 1.9116 0.9870 sec/batch
Epoch 6/20  Iteration 1045/3560 Training loss: 1.9114 0.8767 sec/batch
Epoch 6/20  Iteration 1046/3560 Training loss: 1.9113 0.9018 sec/batch
Epoch 6/20  Iteration 1047/3560 Training loss: 1.9111 0.8427 sec/batch
Epoch 6/20  Iteration 1048/3560 Training loss: 1.9110 0.8415 sec/batch
Epoch 6/20  Iteration 1049/3560 Training loss: 1.9106 0.8968 sec/batch
Epoch 6/20  Iteration 1050/3560 Training loss: 1.9106 0.8571 sec/batch
Epoch 6/20  Iteration 1051/3560 Training loss: 1.9107 0.8667 sec/batch
Epoch 6/20  Iteration 1052/3560 Training loss: 1.9105 0.8470 sec/batch
Epoch 6/20  Iteration 1053/3560 Training loss: 1.9105 0.8777 sec/batch
Epoch 6/20  Iteration 1054/3560 Training loss: 1.9104 0.8947 sec/batch
Epoch 6/20  Iteration 1055/3560 Training loss: 1.9103 0.9260 sec/batch
Epoch 6/20  Iteration 1056/3560 Training loss: 1.9101 0.9276 sec/batch
Epoch 6/20  Iteration 1057/3560 Training loss: 1.9100 1.0190 sec/batch
Epoch 6/20  Iteration 1058/3560 Training loss: 1.9101 1.1140 sec/batch
Epoch 6/20  Iteration 1059/3560 Training loss: 1.9099 0.9492 sec/batch
Epoch 6/20  Iteration 1060/3560 Training loss: 1.9098 0.9138 sec/batch
Epoch 6/20  Iteration 1061/3560 Training loss: 1.9095 0.9376 sec/batch
Epoch 6/20  Iteration 1062/3560 Training loss: 1.9093 0.8386 sec/batch
Epoch 6/20  Iteration 1063/3560 Training loss: 1.9093 0.9105 sec/batch
Epoch 6/20  Iteration 1064/3560 Training loss: 1.9092 0.9449 sec/batch
Epoch 6/20  Iteration 1065/3560 Training loss: 1.9091 0.8746 sec/batch
Epoch 6/20  Iteration 1066/3560 Training loss: 1.9090 0.8695 sec/batch
Epoch 6/20  Iteration 1067/3560 Training loss: 1.9087 0.8913 sec/batch
Epoch 6/20  Iteration 1068/3560 Training loss: 1.9086 0.8949 sec/batch
Epoch 7/20  Iteration 1069/3560 Training loss: 1.9656 0.8631 sec/batch
Epoch 7/20  Iteration 1070/3560 Training loss: 1.9215 0.8458 sec/batch
Epoch 7/20  Iteration 1071/3560 Training loss: 1.9042 0.8811 sec/batch
Epoch 7/20  Iteration 1072/3560 Training loss: 1.8946 0.9135 sec/batch
Epoch 7/20  Iteration 1073/3560 Training loss: 1.8901 0.8896 sec/batch
Epoch 7/20  Iteration 1074/3560 Training loss: 1.8793 0.8789 sec/batch
Epoch 7/20  Iteration 1075/3560 Training loss: 1.8796 0.8846 sec/batch
Epoch 7/20  Iteration 1076/3560 Training loss: 1.8782 0.8651 sec/batch
Epoch 7/20  Iteration 1077/3560 Training loss: 1.8801 0.8872 sec/batch
Epoch 7/20  Iteration 1078/3560 Training loss: 1.8785 0.8722 sec/batch
Epoch 7/20  Iteration 1079/3560 Training loss: 1.8757 0.8608 sec/batch
Epoch 7/20  Iteration 1080/3560 Training loss: 1.8733 0.8482 sec/batch
Epoch 7/20  Iteration 1081/3560 Training loss: 1.8739 0.8837 sec/batch
Epoch 7/20  Iteration 1082/3560 Training loss: 1.8763 0.8409 sec/batch
Epoch 7/20  Iteration 1083/3560 Training loss: 1.8758 0.8797 sec/batch
Epoch 7/20  Iteration 1084/3560 Training loss: 1.8736 0.9113 sec/batch
Epoch 7/20  Iteration 1085/3560 Training loss: 1.8729 0.8365 sec/batch
Epoch 7/20  Iteration 1086/3560 Training loss: 1.8752 0.8508 sec/batch
Epoch 7/20  Iteration 1087/3560 Training loss: 1.8752 0.9564 sec/batch
Epoch 7/20  Iteration 1088/3560 Training loss: 1.8754 0.9353 sec/batch
Epoch 7/20  Iteration 1089/3560 Training loss: 1.8742 0.8893 sec/batch
Epoch 7/20  Iteration 1090/3560 Training loss: 1.8747 0.8685 sec/batch
Epoch 7/20  Iteration 1091/3560 Training loss: 1.8744 0.9074 sec/batch
Epoch 7/20  Iteration 1092/3560 Training loss: 1.8737 0.9831 sec/batch
Epoch 7/20  Iteration 1093/3560 Training loss: 1.8735 0.9067 sec/batch
Epoch 7/20  Iteration 1094/3560 Training loss: 1.8721 0.8569 sec/batch
Epoch 7/20  Iteration 1095/3560 Training loss: 1.8709 0.8692 sec/batch
Epoch 7/20  Iteration 1096/3560 Training loss: 1.8709 0.8686 sec/batch
Epoch 7/20  Iteration 1097/3560 Training loss: 1.8716 0.8918 sec/batch
Epoch 7/20  Iteration 1098/3560 Training loss: 1.8717 0.8756 sec/batch
Epoch 7/20  Iteration 1099/3560 Training loss: 1.8717 0.9112 sec/batch
Epoch 7/20  Iteration 1100/3560 Training loss: 1.8706 0.8407 sec/batch
Epoch 7/20  Iteration 1101/3560 Training loss: 1.8708 0.8452 sec/batch
Epoch 7/20  Iteration 1102/3560 Training loss: 1.8714 0.8804 sec/batch
Epoch 7/20  Iteration 1103/3560 Training loss: 1.8713 0.8507 sec/batch
Epoch 7/20  Iteration 1104/3560 Training loss: 1.8710 0.8465 sec/batch
Epoch 7/20  Iteration 1105/3560 Training loss: 1.8705 0.8536 sec/batch
Epoch 7/20  Iteration 1106/3560 Training loss: 1.8692 0.8477 sec/batch
Epoch 7/20  Iteration 1107/3560 Training loss: 1.8683 0.8486 sec/batch
Epoch 7/20  Iteration 1108/3560 Training loss: 1.8674 0.8495 sec/batch
Epoch 7/20  Iteration 1109/3560 Training loss: 1.8671 0.8430 sec/batch
Epoch 7/20  Iteration 1110/3560 Training loss: 1.8674 0.8449 sec/batch
Epoch 7/20  Iteration 1111/3560 Training loss: 1.8670 0.8412 sec/batch
Epoch 7/20  Iteration 1112/3560 Training loss: 1.8661 0.8668 sec/batch
Epoch 7/20  Iteration 1113/3560 Training loss: 1.8664 0.8464 sec/batch
Epoch 7/20  Iteration 1114/3560 Training loss: 1.8651 0.8453 sec/batch
Epoch 7/20  Iteration 1115/3560 Training loss: 1.8649 0.8452 sec/batch
Epoch 7/20  Iteration 1116/3560 Training loss: 1.8643 0.8437 sec/batch
Epoch 7/20  Iteration 1117/3560 Training loss: 1.8639 0.8489 sec/batch
Epoch 7/20  Iteration 1118/3560 Training loss: 1.8647 0.8513 sec/batch
Epoch 7/20  Iteration 1119/3560 Training loss: 1.8641 0.8421 sec/batch
Epoch 7/20  Iteration 1120/3560 Training loss: 1.8649 0.8867 sec/batch
Epoch 7/20  Iteration 1121/3560 Training loss: 1.8648 0.8466 sec/batch
Epoch 7/20  Iteration 1122/3560 Training loss: 1.8648 0.8443 sec/batch
Epoch 7/20  Iteration 1123/3560 Training loss: 1.8645 0.8500 sec/batch
Epoch 7/20  Iteration 1124/3560 Training loss: 1.8648 0.8488 sec/batch
Epoch 7/20  Iteration 1125/3560 Training loss: 1.8649 0.8449 sec/batch
Epoch 7/20  Iteration 1126/3560 Training loss: 1.8647 0.8370 sec/batch
Epoch 7/20  Iteration 1127/3560 Training loss: 1.8643 0.8479 sec/batch
Epoch 7/20  Iteration 1128/3560 Training loss: 1.8649 0.8611 sec/batch
Epoch 7/20  Iteration 1129/3560 Training loss: 1.8648 0.8410 sec/batch
Epoch 7/20  Iteration 1130/3560 Training loss: 1.8653 0.8467 sec/batch
Epoch 7/20  Iteration 1131/3560 Training loss: 1.8656 0.8431 sec/batch
Epoch 7/20  Iteration 1132/3560 Training loss: 1.8658 0.8437 sec/batch
Epoch 7/20  Iteration 1133/3560 Training loss: 1.8656 0.8421 sec/batch
Epoch 7/20  Iteration 1134/3560 Training loss: 1.8657 0.8487 sec/batch
Epoch 7/20  Iteration 1135/3560 Training loss: 1.8658 0.8453 sec/batch
Epoch 7/20  Iteration 1136/3560 Training loss: 1.8652 0.8411 sec/batch
Epoch 7/20  Iteration 1137/3560 Training loss: 1.8650 0.8387 sec/batch
Epoch 7/20  Iteration 1138/3560 Training loss: 1.8649 0.8485 sec/batch
Epoch 7/20  Iteration 1139/3560 Training loss: 1.8654 0.8463 sec/batch
Epoch 7/20  Iteration 1140/3560 Training loss: 1.8654 0.8519 sec/batch
Epoch 7/20  Iteration 1141/3560 Training loss: 1.8658 1.1088 sec/batch
Epoch 7/20  Iteration 1142/3560 Training loss: 1.8654 0.8419 sec/batch
Epoch 7/20  Iteration 1143/3560 Training loss: 1.8652 0.8437 sec/batch
Epoch 7/20  Iteration 1144/3560 Training loss: 1.8656 0.8463 sec/batch
Epoch 7/20  Iteration 1145/3560 Training loss: 1.8655 0.8462 sec/batch
Epoch 7/20  Iteration 1146/3560 Training loss: 1.8655 0.8404 sec/batch
Epoch 7/20  Iteration 1147/3560 Training loss: 1.8649 0.8961 sec/batch
Epoch 7/20  Iteration 1148/3560 Training loss: 1.8647 0.8438 sec/batch
Epoch 7/20  Iteration 1149/3560 Training loss: 1.8641 0.8858 sec/batch
Epoch 7/20  Iteration 1150/3560 Training loss: 1.8642 0.8468 sec/batch
Epoch 7/20  Iteration 1151/3560 Training loss: 1.8637 0.8448 sec/batch
Epoch 7/20  Iteration 1152/3560 Training loss: 1.8635 0.8422 sec/batch
Epoch 7/20  Iteration 1153/3560 Training loss: 1.8629 0.8479 sec/batch
Epoch 7/20  Iteration 1154/3560 Training loss: 1.8624 0.8337 sec/batch
Epoch 7/20  Iteration 1155/3560 Training loss: 1.8622 0.8475 sec/batch
Epoch 7/20  Iteration 1156/3560 Training loss: 1.8619 0.8466 sec/batch
Epoch 7/20  Iteration 1157/3560 Training loss: 1.8613 0.8365 sec/batch
Epoch 7/20  Iteration 1158/3560 Training loss: 1.8612 0.8407 sec/batch
Epoch 7/20  Iteration 1159/3560 Training loss: 1.8607 0.8451 sec/batch
Epoch 7/20  Iteration 1160/3560 Training loss: 1.8605 0.8391 sec/batch
Epoch 7/20  Iteration 1161/3560 Training loss: 1.8599 0.8989 sec/batch
Epoch 7/20  Iteration 1162/3560 Training loss: 1.8596 0.8447 sec/batch
Epoch 7/20  Iteration 1163/3560 Training loss: 1.8592 0.8445 sec/batch
Epoch 7/20  Iteration 1164/3560 Training loss: 1.8590 0.8482 sec/batch
Epoch 7/20  Iteration 1165/3560 Training loss: 1.8588 0.8478 sec/batch
Epoch 7/20  Iteration 1166/3560 Training loss: 1.8585 0.8427 sec/batch
Epoch 7/20  Iteration 1167/3560 Training loss: 1.8580 0.8461 sec/batch
Epoch 7/20  Iteration 1168/3560 Training loss: 1.8573 0.8441 sec/batch
Epoch 7/20  Iteration 1169/3560 Training loss: 1.8573 0.8397 sec/batch
Epoch 7/20  Iteration 1170/3560 Training loss: 1.8570 0.8443 sec/batch
Epoch 7/20  Iteration 1171/3560 Training loss: 1.8567 0.8421 sec/batch
Epoch 7/20  Iteration 1172/3560 Training loss: 1.8564 0.8426 sec/batch
Epoch 7/20  Iteration 1173/3560 Training loss: 1.8562 0.8497 sec/batch
Epoch 7/20  Iteration 1174/3560 Training loss: 1.8559 0.8380 sec/batch
Epoch 7/20  Iteration 1175/3560 Training loss: 1.8556 0.8605 sec/batch
Epoch 7/20  Iteration 1176/3560 Training loss: 1.8555 0.8523 sec/batch
Epoch 7/20  Iteration 1177/3560 Training loss: 1.8554 0.8425 sec/batch
Epoch 7/20  Iteration 1178/3560 Training loss: 1.8553 0.8419 sec/batch
Epoch 7/20  Iteration 1179/3560 Training loss: 1.8551 0.8383 sec/batch
Epoch 7/20  Iteration 1180/3560 Training loss: 1.8548 0.8525 sec/batch
Epoch 7/20  Iteration 1181/3560 Training loss: 1.8545 0.8543 sec/batch
Epoch 7/20  Iteration 1182/3560 Training loss: 1.8543 0.8470 sec/batch
Epoch 7/20  Iteration 1183/3560 Training loss: 1.8539 0.8434 sec/batch
Epoch 7/20  Iteration 1184/3560 Training loss: 1.8534 0.8372 sec/batch
Epoch 7/20  Iteration 1185/3560 Training loss: 1.8532 0.8390 sec/batch
Epoch 7/20  Iteration 1186/3560 Training loss: 1.8530 0.8465 sec/batch
Epoch 7/20  Iteration 1187/3560 Training loss: 1.8530 0.8375 sec/batch
Epoch 7/20  Iteration 1188/3560 Training loss: 1.8528 0.8414 sec/batch
Epoch 7/20  Iteration 1189/3560 Training loss: 1.8526 0.8445 sec/batch
Epoch 7/20  Iteration 1190/3560 Training loss: 1.8522 0.8403 sec/batch
Epoch 7/20  Iteration 1191/3560 Training loss: 1.8519 0.8431 sec/batch
Epoch 7/20  Iteration 1192/3560 Training loss: 1.8520 0.8445 sec/batch
Epoch 7/20  Iteration 1193/3560 Training loss: 1.8518 0.8479 sec/batch
Epoch 7/20  Iteration 1194/3560 Training loss: 1.8513 0.8485 sec/batch
Epoch 7/20  Iteration 1195/3560 Training loss: 1.8513 0.8430 sec/batch
Epoch 7/20  Iteration 1196/3560 Training loss: 1.8512 0.8358 sec/batch
Epoch 7/20  Iteration 1197/3560 Training loss: 1.8511 0.8456 sec/batch
Epoch 7/20  Iteration 1198/3560 Training loss: 1.8509 0.8651 sec/batch
Epoch 7/20  Iteration 1199/3560 Training loss: 1.8506 0.8856 sec/batch
Epoch 7/20  Iteration 1200/3560 Training loss: 1.8502 0.8359 sec/batch
Epoch 7/20  Iteration 1201/3560 Training loss: 1.8501 0.8407 sec/batch
Epoch 7/20  Iteration 1202/3560 Training loss: 1.8500 0.8627 sec/batch
Epoch 7/20  Iteration 1203/3560 Training loss: 1.8500 1.0992 sec/batch
Epoch 7/20  Iteration 1204/3560 Training loss: 1.8499 1.3441 sec/batch
Epoch 7/20  Iteration 1205/3560 Training loss: 1.8498 0.9566 sec/batch
Epoch 7/20  Iteration 1206/3560 Training loss: 1.8498 0.9556 sec/batch
Epoch 7/20  Iteration 1207/3560 Training loss: 1.8498 0.8866 sec/batch
Epoch 7/20  Iteration 1208/3560 Training loss: 1.8496 0.8639 sec/batch
Epoch 7/20  Iteration 1209/3560 Training loss: 1.8497 0.8602 sec/batch
Epoch 7/20  Iteration 1210/3560 Training loss: 1.8495 1.0183 sec/batch
Epoch 7/20  Iteration 1211/3560 Training loss: 1.8493 0.9410 sec/batch
Epoch 7/20  Iteration 1212/3560 Training loss: 1.8492 0.9433 sec/batch
Epoch 7/20  Iteration 1213/3560 Training loss: 1.8490 0.9253 sec/batch
Epoch 7/20  Iteration 1214/3560 Training loss: 1.8490 0.8749 sec/batch
Epoch 7/20  Iteration 1215/3560 Training loss: 1.8489 0.8782 sec/batch
Epoch 7/20  Iteration 1216/3560 Training loss: 1.8490 0.8761 sec/batch
Epoch 7/20  Iteration 1217/3560 Training loss: 1.8489 1.0800 sec/batch
Epoch 7/20  Iteration 1218/3560 Training loss: 1.8486 0.9800 sec/batch
Epoch 7/20  Iteration 1219/3560 Training loss: 1.8483 1.2193 sec/batch
Epoch 7/20  Iteration 1220/3560 Training loss: 1.8483 0.9204 sec/batch
Epoch 7/20  Iteration 1221/3560 Training loss: 1.8482 0.9483 sec/batch
Epoch 7/20  Iteration 1222/3560 Training loss: 1.8482 0.9128 sec/batch
Epoch 7/20  Iteration 1223/3560 Training loss: 1.8481 0.8970 sec/batch
Epoch 7/20  Iteration 1224/3560 Training loss: 1.8480 1.0589 sec/batch
Epoch 7/20  Iteration 1225/3560 Training loss: 1.8480 1.0202 sec/batch
Epoch 7/20  Iteration 1226/3560 Training loss: 1.8478 0.9166 sec/batch
Epoch 7/20  Iteration 1227/3560 Training loss: 1.8476 0.8431 sec/batch
Epoch 7/20  Iteration 1228/3560 Training loss: 1.8476 0.8450 sec/batch
Epoch 7/20  Iteration 1229/3560 Training loss: 1.8477 0.8665 sec/batch
Epoch 7/20  Iteration 1230/3560 Training loss: 1.8475 0.8460 sec/batch
Epoch 7/20  Iteration 1231/3560 Training loss: 1.8476 0.8664 sec/batch
Epoch 7/20  Iteration 1232/3560 Training loss: 1.8475 0.8523 sec/batch
Epoch 7/20  Iteration 1233/3560 Training loss: 1.8474 0.8426 sec/batch
Epoch 7/20  Iteration 1234/3560 Training loss: 1.8472 0.8623 sec/batch
Epoch 7/20  Iteration 1235/3560 Training loss: 1.8472 0.8591 sec/batch
Epoch 7/20  Iteration 1236/3560 Training loss: 1.8474 0.8443 sec/batch
Epoch 7/20  Iteration 1237/3560 Training loss: 1.8473 0.8498 sec/batch
Epoch 7/20  Iteration 1238/3560 Training loss: 1.8472 0.8507 sec/batch
Epoch 7/20  Iteration 1239/3560 Training loss: 1.8469 0.8450 sec/batch
Epoch 7/20  Iteration 1240/3560 Training loss: 1.8467 0.8453 sec/batch
Epoch 7/20  Iteration 1241/3560 Training loss: 1.8467 0.8461 sec/batch
Epoch 7/20  Iteration 1242/3560 Training loss: 1.8466 0.9013 sec/batch
Epoch 7/20  Iteration 1243/3560 Training loss: 1.8466 0.8626 sec/batch
Epoch 7/20  Iteration 1244/3560 Training loss: 1.8465 0.8439 sec/batch
Epoch 7/20  Iteration 1245/3560 Training loss: 1.8463 0.8458 sec/batch
Epoch 7/20  Iteration 1246/3560 Training loss: 1.8462 0.8499 sec/batch
Epoch 8/20  Iteration 1247/3560 Training loss: 1.8989 0.8411 sec/batch
Epoch 8/20  Iteration 1248/3560 Training loss: 1.8584 1.0398 sec/batch
Epoch 8/20  Iteration 1249/3560 Training loss: 1.8460 0.9253 sec/batch
Epoch 8/20  Iteration 1250/3560 Training loss: 1.8384 0.8519 sec/batch
Epoch 8/20  Iteration 1251/3560 Training loss: 1.8369 0.8511 sec/batch
Epoch 8/20  Iteration 1252/3560 Training loss: 1.8274 0.8374 sec/batch
Epoch 8/20  Iteration 1253/3560 Training loss: 1.8271 0.8440 sec/batch
Epoch 8/20  Iteration 1254/3560 Training loss: 1.8254 0.8487 sec/batch
Epoch 8/20  Iteration 1255/3560 Training loss: 1.8269 0.8920 sec/batch
Epoch 8/20  Iteration 1256/3560 Training loss: 1.8249 0.8491 sec/batch
Epoch 8/20  Iteration 1257/3560 Training loss: 1.8211 0.8455 sec/batch
Epoch 8/20  Iteration 1258/3560 Training loss: 1.8190 0.8439 sec/batch
Epoch 8/20  Iteration 1259/3560 Training loss: 1.8190 0.8447 sec/batch
Epoch 8/20  Iteration 1260/3560 Training loss: 1.8214 0.8440 sec/batch
Epoch 8/20  Iteration 1261/3560 Training loss: 1.8202 0.8404 sec/batch
Epoch 8/20  Iteration 1262/3560 Training loss: 1.8189 0.8538 sec/batch
Epoch 8/20  Iteration 1263/3560 Training loss: 1.8185 0.8619 sec/batch
Epoch 8/20  Iteration 1264/3560 Training loss: 1.8202 0.8403 sec/batch
Epoch 8/20  Iteration 1265/3560 Training loss: 1.8207 0.8608 sec/batch
Epoch 8/20  Iteration 1266/3560 Training loss: 1.8211 0.8554 sec/batch
Epoch 8/20  Iteration 1267/3560 Training loss: 1.8197 0.8485 sec/batch
Epoch 8/20  Iteration 1268/3560 Training loss: 1.8206 0.8485 sec/batch
Epoch 8/20  Iteration 1269/3560 Training loss: 1.8200 0.8938 sec/batch
Epoch 8/20  Iteration 1270/3560 Training loss: 1.8194 0.8873 sec/batch
Epoch 8/20  Iteration 1271/3560 Training loss: 1.8192 0.8694 sec/batch
Epoch 8/20  Iteration 1272/3560 Training loss: 1.8180 0.9981 sec/batch
Epoch 8/20  Iteration 1273/3560 Training loss: 1.8165 1.0433 sec/batch
Epoch 8/20  Iteration 1274/3560 Training loss: 1.8169 1.0091 sec/batch
Epoch 8/20  Iteration 1275/3560 Training loss: 1.8178 0.9975 sec/batch
Epoch 8/20  Iteration 1276/3560 Training loss: 1.8180 0.8873 sec/batch
Epoch 8/20  Iteration 1277/3560 Training loss: 1.8179 0.8492 sec/batch
Epoch 8/20  Iteration 1278/3560 Training loss: 1.8172 0.8447 sec/batch
Epoch 8/20  Iteration 1279/3560 Training loss: 1.8174 0.8440 sec/batch
Epoch 8/20  Iteration 1280/3560 Training loss: 1.8181 0.8635 sec/batch
Epoch 8/20  Iteration 1281/3560 Training loss: 1.8181 0.8474 sec/batch
Epoch 8/20  Iteration 1282/3560 Training loss: 1.8175 0.8457 sec/batch
Epoch 8/20  Iteration 1283/3560 Training loss: 1.8169 0.8440 sec/batch
Epoch 8/20  Iteration 1284/3560 Training loss: 1.8155 0.8384 sec/batch
Epoch 8/20  Iteration 1285/3560 Training loss: 1.8143 0.8469 sec/batch
Epoch 8/20  Iteration 1286/3560 Training loss: 1.8135 0.8887 sec/batch
Epoch 8/20  Iteration 1287/3560 Training loss: 1.8130 0.8480 sec/batch
Epoch 8/20  Iteration 1288/3560 Training loss: 1.8134 0.8474 sec/batch
Epoch 8/20  Iteration 1289/3560 Training loss: 1.8127 0.8554 sec/batch
Epoch 8/20  Iteration 1290/3560 Training loss: 1.8119 0.8505 sec/batch
Epoch 8/20  Iteration 1291/3560 Training loss: 1.8120 0.8594 sec/batch
Epoch 8/20  Iteration 1292/3560 Training loss: 1.8109 0.8427 sec/batch
Epoch 8/20  Iteration 1293/3560 Training loss: 1.8107 0.8606 sec/batch
Epoch 8/20  Iteration 1294/3560 Training loss: 1.8102 0.8471 sec/batch
Epoch 8/20  Iteration 1295/3560 Training loss: 1.8100 0.8373 sec/batch
Epoch 8/20  Iteration 1296/3560 Training loss: 1.8107 0.8675 sec/batch
Epoch 8/20  Iteration 1297/3560 Training loss: 1.8101 0.8532 sec/batch
Epoch 8/20  Iteration 1298/3560 Training loss: 1.8108 0.8528 sec/batch
Epoch 8/20  Iteration 1299/3560 Training loss: 1.8106 0.8410 sec/batch
Epoch 8/20  Iteration 1300/3560 Training loss: 1.8105 0.8714 sec/batch
Epoch 8/20  Iteration 1301/3560 Training loss: 1.8103 0.8673 sec/batch
Epoch 8/20  Iteration 1302/3560 Training loss: 1.8105 0.8440 sec/batch
Epoch 8/20  Iteration 1303/3560 Training loss: 1.8105 0.8494 sec/batch
Epoch 8/20  Iteration 1304/3560 Training loss: 1.8103 0.9033 sec/batch
Epoch 8/20  Iteration 1305/3560 Training loss: 1.8097 0.8941 sec/batch
Epoch 8/20  Iteration 1306/3560 Training loss: 1.8103 0.8916 sec/batch
Epoch 8/20  Iteration 1307/3560 Training loss: 1.8101 0.8803 sec/batch
Epoch 8/20  Iteration 1308/3560 Training loss: 1.8108 0.8538 sec/batch
Epoch 8/20  Iteration 1309/3560 Training loss: 1.8112 0.9327 sec/batch
Epoch 8/20  Iteration 1310/3560 Training loss: 1.8114 0.8972 sec/batch
Epoch 8/20  Iteration 1311/3560 Training loss: 1.8111 0.8817 sec/batch
Epoch 8/20  Iteration 1312/3560 Training loss: 1.8113 0.8401 sec/batch
Epoch 8/20  Iteration 1313/3560 Training loss: 1.8113 0.8461 sec/batch
Epoch 8/20  Iteration 1314/3560 Training loss: 1.8108 0.8777 sec/batch
Epoch 8/20  Iteration 1315/3560 Training loss: 1.8106 0.8674 sec/batch
Epoch 8/20  Iteration 1316/3560 Training loss: 1.8103 0.9336 sec/batch
Epoch 8/20  Iteration 1317/3560 Training loss: 1.8109 1.2010 sec/batch
Epoch 8/20  Iteration 1318/3560 Training loss: 1.8108 0.9106 sec/batch
Epoch 8/20  Iteration 1319/3560 Training loss: 1.8111 0.9286 sec/batch
Epoch 8/20  Iteration 1320/3560 Training loss: 1.8108 0.8780 sec/batch
Epoch 8/20  Iteration 1321/3560 Training loss: 1.8106 0.8663 sec/batch
Epoch 8/20  Iteration 1322/3560 Training loss: 1.8108 0.8699 sec/batch
Epoch 8/20  Iteration 1323/3560 Training loss: 1.8106 0.9662 sec/batch
Epoch 8/20  Iteration 1324/3560 Training loss: 1.8106 0.9489 sec/batch
Epoch 8/20  Iteration 1325/3560 Training loss: 1.8099 0.9206 sec/batch
Epoch 8/20  Iteration 1326/3560 Training loss: 1.8096 0.8882 sec/batch
Epoch 8/20  Iteration 1327/3560 Training loss: 1.8090 0.8682 sec/batch
Epoch 8/20  Iteration 1328/3560 Training loss: 1.8091 0.8712 sec/batch
Epoch 8/20  Iteration 1329/3560 Training loss: 1.8085 0.8648 sec/batch
Epoch 8/20  Iteration 1330/3560 Training loss: 1.8083 0.9087 sec/batch
Epoch 8/20  Iteration 1331/3560 Training loss: 1.8077 0.9083 sec/batch
Epoch 8/20  Iteration 1332/3560 Training loss: 1.8073 0.9259 sec/batch
Epoch 8/20  Iteration 1333/3560 Training loss: 1.8071 1.0209 sec/batch
Epoch 8/20  Iteration 1334/3560 Training loss: 1.8068 0.8739 sec/batch
Epoch 8/20  Iteration 1335/3560 Training loss: 1.8061 0.9044 sec/batch
Epoch 8/20  Iteration 1336/3560 Training loss: 1.8061 0.9075 sec/batch
Epoch 8/20  Iteration 1337/3560 Training loss: 1.8057 0.8537 sec/batch
Epoch 8/20  Iteration 1338/3560 Training loss: 1.8054 0.8394 sec/batch
Epoch 8/20  Iteration 1339/3560 Training loss: 1.8049 0.8648 sec/batch
Epoch 8/20  Iteration 1340/3560 Training loss: 1.8046 0.8453 sec/batch
Epoch 8/20  Iteration 1341/3560 Training loss: 1.8042 0.8432 sec/batch
Epoch 8/20  Iteration 1342/3560 Training loss: 1.8041 0.8449 sec/batch
Epoch 8/20  Iteration 1343/3560 Training loss: 1.8039 0.8526 sec/batch
Epoch 8/20  Iteration 1344/3560 Training loss: 1.8035 0.8640 sec/batch
Epoch 8/20  Iteration 1345/3560 Training loss: 1.8031 0.8992 sec/batch
Epoch 8/20  Iteration 1346/3560 Training loss: 1.8026 0.8446 sec/batch
Epoch 8/20  Iteration 1347/3560 Training loss: 1.8026 0.8549 sec/batch
Epoch 8/20  Iteration 1348/3560 Training loss: 1.8024 0.8902 sec/batch
Epoch 8/20  Iteration 1349/3560 Training loss: 1.8021 0.8464 sec/batch
Epoch 8/20  Iteration 1350/3560 Training loss: 1.8020 0.8463 sec/batch
Epoch 8/20  Iteration 1351/3560 Training loss: 1.8017 0.8570 sec/batch
Epoch 8/20  Iteration 1352/3560 Training loss: 1.8015 0.8368 sec/batch
Epoch 8/20  Iteration 1353/3560 Training loss: 1.8013 0.8595 sec/batch
Epoch 8/20  Iteration 1354/3560 Training loss: 1.8011 0.8457 sec/batch
Epoch 8/20  Iteration 1355/3560 Training loss: 1.8011 0.8651 sec/batch
Epoch 8/20  Iteration 1356/3560 Training loss: 1.8010 0.8695 sec/batch
Epoch 8/20  Iteration 1357/3560 Training loss: 1.8008 0.8520 sec/batch
Epoch 8/20  Iteration 1358/3560 Training loss: 1.8007 0.8490 sec/batch
Epoch 8/20  Iteration 1359/3560 Training loss: 1.8005 0.8958 sec/batch
Epoch 8/20  Iteration 1360/3560 Training loss: 1.8002 0.9684 sec/batch
Epoch 8/20  Iteration 1361/3560 Training loss: 1.7999 0.8753 sec/batch
Epoch 8/20  Iteration 1362/3560 Training loss: 1.7994 0.8874 sec/batch
Epoch 8/20  Iteration 1363/3560 Training loss: 1.7993 0.8798 sec/batch
Epoch 8/20  Iteration 1364/3560 Training loss: 1.7991 0.8402 sec/batch
Epoch 8/20  Iteration 1365/3560 Training loss: 1.7990 0.8456 sec/batch
Epoch 8/20  Iteration 1366/3560 Training loss: 1.7989 0.8496 sec/batch
Epoch 8/20  Iteration 1367/3560 Training loss: 1.7988 0.8474 sec/batch
Epoch 8/20  Iteration 1368/3560 Training loss: 1.7984 0.8391 sec/batch
Epoch 8/20  Iteration 1369/3560 Training loss: 1.7980 0.8438 sec/batch
Epoch 8/20  Iteration 1370/3560 Training loss: 1.7980 0.8428 sec/batch
Epoch 8/20  Iteration 1371/3560 Training loss: 1.7978 0.8468 sec/batch
Epoch 8/20  Iteration 1372/3560 Training loss: 1.7973 0.8424 sec/batch
Epoch 8/20  Iteration 1373/3560 Training loss: 1.7974 0.8513 sec/batch
Epoch 8/20  Iteration 1374/3560 Training loss: 1.7974 0.8411 sec/batch
Epoch 8/20  Iteration 1375/3560 Training loss: 1.7973 0.8405 sec/batch
Epoch 8/20  Iteration 1376/3560 Training loss: 1.7971 0.8470 sec/batch
Epoch 8/20  Iteration 1377/3560 Training loss: 1.7968 0.8426 sec/batch
Epoch 8/20  Iteration 1378/3560 Training loss: 1.7964 0.9170 sec/batch
Epoch 8/20  Iteration 1379/3560 Training loss: 1.7964 0.8914 sec/batch
Epoch 8/20  Iteration 1380/3560 Training loss: 1.7964 0.8445 sec/batch
Epoch 8/20  Iteration 1381/3560 Training loss: 1.7963 0.8454 sec/batch
Epoch 8/20  Iteration 1382/3560 Training loss: 1.7963 0.8404 sec/batch
Epoch 8/20  Iteration 1383/3560 Training loss: 1.7963 0.8431 sec/batch
Epoch 8/20  Iteration 1384/3560 Training loss: 1.7963 0.9248 sec/batch
Epoch 8/20  Iteration 1385/3560 Training loss: 1.7964 0.8412 sec/batch
Epoch 8/20  Iteration 1386/3560 Training loss: 1.7962 0.8423 sec/batch
Epoch 8/20  Iteration 1387/3560 Training loss: 1.7963 0.8448 sec/batch
Epoch 8/20  Iteration 1388/3560 Training loss: 1.7962 0.8723 sec/batch
Epoch 8/20  Iteration 1389/3560 Training loss: 1.7961 0.8842 sec/batch
Epoch 8/20  Iteration 1390/3560 Training loss: 1.7961 0.8896 sec/batch
Epoch 8/20  Iteration 1391/3560 Training loss: 1.7958 0.8686 sec/batch
Epoch 8/20  Iteration 1392/3560 Training loss: 1.7959 0.8629 sec/batch
Epoch 8/20  Iteration 1393/3560 Training loss: 1.7958 0.9100 sec/batch
Epoch 8/20  Iteration 1394/3560 Training loss: 1.7959 0.9064 sec/batch
Epoch 8/20  Iteration 1395/3560 Training loss: 1.7959 0.8910 sec/batch
Epoch 8/20  Iteration 1396/3560 Training loss: 1.7957 0.8933 sec/batch
Epoch 8/20  Iteration 1397/3560 Training loss: 1.7954 0.8990 sec/batch
Epoch 8/20  Iteration 1398/3560 Training loss: 1.7955 1.0068 sec/batch
Epoch 8/20  Iteration 1399/3560 Training loss: 1.7955 0.8775 sec/batch
Epoch 8/20  Iteration 1400/3560 Training loss: 1.7955 0.9335 sec/batch
Epoch 8/20  Iteration 1401/3560 Training loss: 1.7954 0.8767 sec/batch
Epoch 8/20  Iteration 1402/3560 Training loss: 1.7954 0.8441 sec/batch
Epoch 8/20  Iteration 1403/3560 Training loss: 1.7953 0.8460 sec/batch
Epoch 8/20  Iteration 1404/3560 Training loss: 1.7952 0.8418 sec/batch
Epoch 8/20  Iteration 1405/3560 Training loss: 1.7950 0.8434 sec/batch
Epoch 8/20  Iteration 1406/3560 Training loss: 1.7951 0.8472 sec/batch
Epoch 8/20  Iteration 1407/3560 Training loss: 1.7952 0.8837 sec/batch
Epoch 8/20  Iteration 1408/3560 Training loss: 1.7951 0.8395 sec/batch
Epoch 8/20  Iteration 1409/3560 Training loss: 1.7951 0.8402 sec/batch
Epoch 8/20  Iteration 1410/3560 Training loss: 1.7951 0.8551 sec/batch
Epoch 8/20  Iteration 1411/3560 Training loss: 1.7950 0.8754 sec/batch
Epoch 8/20  Iteration 1412/3560 Training loss: 1.7949 0.8407 sec/batch
Epoch 8/20  Iteration 1413/3560 Training loss: 1.7949 0.8554 sec/batch
Epoch 8/20  Iteration 1414/3560 Training loss: 1.7952 0.8430 sec/batch
Epoch 8/20  Iteration 1415/3560 Training loss: 1.7952 0.8476 sec/batch
Epoch 8/20  Iteration 1416/3560 Training loss: 1.7950 0.8535 sec/batch
Epoch 8/20  Iteration 1417/3560 Training loss: 1.7948 0.8374 sec/batch
Epoch 8/20  Iteration 1418/3560 Training loss: 1.7946 0.8530 sec/batch
Epoch 8/20  Iteration 1419/3560 Training loss: 1.7946 0.8632 sec/batch
Epoch 8/20  Iteration 1420/3560 Training loss: 1.7946 0.8705 sec/batch
Epoch 8/20  Iteration 1421/3560 Training loss: 1.7946 0.8512 sec/batch
Epoch 8/20  Iteration 1422/3560 Training loss: 1.7944 0.8684 sec/batch
Epoch 8/20  Iteration 1423/3560 Training loss: 1.7942 0.8681 sec/batch
Epoch 8/20  Iteration 1424/3560 Training loss: 1.7942 0.8397 sec/batch
Epoch 9/20  Iteration 1425/3560 Training loss: 1.8590 0.8436 sec/batch
Epoch 9/20  Iteration 1426/3560 Training loss: 1.8154 0.8414 sec/batch
Epoch 9/20  Iteration 1427/3560 Training loss: 1.8029 0.8454 sec/batch
Epoch 9/20  Iteration 1428/3560 Training loss: 1.7946 0.8472 sec/batch
Epoch 9/20  Iteration 1429/3560 Training loss: 1.7911 0.8559 sec/batch
Epoch 9/20  Iteration 1430/3560 Training loss: 1.7802 0.9096 sec/batch
Epoch 9/20  Iteration 1431/3560 Training loss: 1.7795 0.8500 sec/batch
Epoch 9/20  Iteration 1432/3560 Training loss: 1.7773 0.8459 sec/batch
Epoch 9/20  Iteration 1433/3560 Training loss: 1.7785 0.8512 sec/batch
Epoch 9/20  Iteration 1434/3560 Training loss: 1.7777 0.8416 sec/batch
Epoch 9/20  Iteration 1435/3560 Training loss: 1.7737 0.8425 sec/batch
Epoch 9/20  Iteration 1436/3560 Training loss: 1.7718 0.8573 sec/batch
Epoch 9/20  Iteration 1437/3560 Training loss: 1.7715 0.8554 sec/batch
Epoch 9/20  Iteration 1438/3560 Training loss: 1.7739 0.8489 sec/batch
Epoch 9/20  Iteration 1439/3560 Training loss: 1.7725 0.8443 sec/batch
Epoch 9/20  Iteration 1440/3560 Training loss: 1.7703 0.8851 sec/batch
Epoch 9/20  Iteration 1441/3560 Training loss: 1.7705 0.8471 sec/batch
Epoch 9/20  Iteration 1442/3560 Training loss: 1.7723 0.8426 sec/batch
Epoch 9/20  Iteration 1443/3560 Training loss: 1.7727 0.8420 sec/batch
Epoch 9/20  Iteration 1444/3560 Training loss: 1.7733 0.8432 sec/batch
Epoch 9/20  Iteration 1445/3560 Training loss: 1.7723 0.8439 sec/batch
Epoch 9/20  Iteration 1446/3560 Training loss: 1.7736 0.8467 sec/batch
Epoch 9/20  Iteration 1447/3560 Training loss: 1.7731 0.8423 sec/batch
Epoch 9/20  Iteration 1448/3560 Training loss: 1.7726 0.8600 sec/batch
Epoch 9/20  Iteration 1449/3560 Training loss: 1.7721 0.8447 sec/batch
Epoch 9/20  Iteration 1450/3560 Training loss: 1.7706 0.8414 sec/batch
Epoch 9/20  Iteration 1451/3560 Training loss: 1.7692 0.8448 sec/batch
Epoch 9/20  Iteration 1452/3560 Training loss: 1.7694 0.8439 sec/batch
Epoch 9/20  Iteration 1453/3560 Training loss: 1.7703 0.8424 sec/batch
Epoch 9/20  Iteration 1454/3560 Training loss: 1.7702 0.8383 sec/batch
Epoch 9/20  Iteration 1455/3560 Training loss: 1.7700 0.8457 sec/batch
Epoch 9/20  Iteration 1456/3560 Training loss: 1.7692 0.8503 sec/batch
Epoch 9/20  Iteration 1457/3560 Training loss: 1.7691 0.8445 sec/batch
Epoch 9/20  Iteration 1458/3560 Training loss: 1.7699 0.8391 sec/batch
Epoch 9/20  Iteration 1459/3560 Training loss: 1.7697 0.8499 sec/batch
Epoch 9/20  Iteration 1460/3560 Training loss: 1.7693 0.8395 sec/batch
Epoch 9/20  Iteration 1461/3560 Training loss: 1.7689 0.8405 sec/batch
Epoch 9/20  Iteration 1462/3560 Training loss: 1.7676 0.8448 sec/batch
Epoch 9/20  Iteration 1463/3560 Training loss: 1.7663 0.8488 sec/batch
Epoch 9/20  Iteration 1464/3560 Training loss: 1.7656 0.8514 sec/batch
Epoch 9/20  Iteration 1465/3560 Training loss: 1.7649 0.8450 sec/batch
Epoch 9/20  Iteration 1466/3560 Training loss: 1.7651 0.8461 sec/batch
Epoch 9/20  Iteration 1467/3560 Training loss: 1.7645 0.8435 sec/batch
Epoch 9/20  Iteration 1468/3560 Training loss: 1.7637 0.8443 sec/batch
Epoch 9/20  Iteration 1469/3560 Training loss: 1.7639 0.8430 sec/batch
Epoch 9/20  Iteration 1470/3560 Training loss: 1.7626 0.8420 sec/batch
Epoch 9/20  Iteration 1471/3560 Training loss: 1.7622 0.8421 sec/batch
Epoch 9/20  Iteration 1472/3560 Training loss: 1.7616 0.8486 sec/batch
Epoch 9/20  Iteration 1473/3560 Training loss: 1.7613 0.8487 sec/batch
Epoch 9/20  Iteration 1474/3560 Training loss: 1.7620 0.8426 sec/batch
Epoch 9/20  Iteration 1475/3560 Training loss: 1.7615 0.8436 sec/batch
Epoch 9/20  Iteration 1476/3560 Training loss: 1.7623 0.8500 sec/batch
Epoch 9/20  Iteration 1477/3560 Training loss: 1.7622 0.8398 sec/batch
Epoch 9/20  Iteration 1478/3560 Training loss: 1.7621 0.8514 sec/batch
Epoch 9/20  Iteration 1479/3560 Training loss: 1.7619 0.8470 sec/batch
Epoch 9/20  Iteration 1480/3560 Training loss: 1.7621 0.8449 sec/batch
Epoch 9/20  Iteration 1481/3560 Training loss: 1.7624 0.8487 sec/batch
Epoch 9/20  Iteration 1482/3560 Training loss: 1.7621 0.8487 sec/batch
Epoch 9/20  Iteration 1483/3560 Training loss: 1.7616 0.8765 sec/batch
Epoch 9/20  Iteration 1484/3560 Training loss: 1.7623 0.8439 sec/batch
Epoch 9/20  Iteration 1485/3560 Training loss: 1.7623 0.8541 sec/batch
Epoch 9/20  Iteration 1486/3560 Training loss: 1.7629 0.8889 sec/batch
Epoch 9/20  Iteration 1487/3560 Training loss: 1.7634 0.8410 sec/batch
Epoch 9/20  Iteration 1488/3560 Training loss: 1.7637 0.8738 sec/batch
Epoch 9/20  Iteration 1489/3560 Training loss: 1.7633 0.9029 sec/batch
Epoch 9/20  Iteration 1490/3560 Training loss: 1.7635 1.0092 sec/batch
Epoch 9/20  Iteration 1491/3560 Training loss: 1.7636 0.9873 sec/batch
Epoch 9/20  Iteration 1492/3560 Training loss: 1.7631 1.0219 sec/batch
Epoch 9/20  Iteration 1493/3560 Training loss: 1.7631 0.8560 sec/batch
Epoch 9/20  Iteration 1494/3560 Training loss: 1.7630 0.8575 sec/batch
Epoch 9/20  Iteration 1495/3560 Training loss: 1.7636 0.8445 sec/batch
Epoch 9/20  Iteration 1496/3560 Training loss: 1.7637 0.8416 sec/batch
Epoch 9/20  Iteration 1497/3560 Training loss: 1.7642 0.8438 sec/batch
Epoch 9/20  Iteration 1498/3560 Training loss: 1.7640 0.9535 sec/batch
Epoch 9/20  Iteration 1499/3560 Training loss: 1.7638 0.8677 sec/batch
Epoch 9/20  Iteration 1500/3560 Training loss: 1.7641 0.8858 sec/batch
Epoch 9/20  Iteration 1501/3560 Training loss: 1.7640 0.8630 sec/batch
Epoch 9/20  Iteration 1502/3560 Training loss: 1.7640 0.8462 sec/batch
Epoch 9/20  Iteration 1503/3560 Training loss: 1.7634 0.8506 sec/batch
Epoch 9/20  Iteration 1504/3560 Training loss: 1.7632 0.8491 sec/batch
Epoch 9/20  Iteration 1505/3560 Training loss: 1.7627 0.8444 sec/batch
Epoch 9/20  Iteration 1506/3560 Training loss: 1.7628 0.8504 sec/batch
Epoch 9/20  Iteration 1507/3560 Training loss: 1.7621 0.8535 sec/batch
Epoch 9/20  Iteration 1508/3560 Training loss: 1.7620 0.8413 sec/batch
Epoch 9/20  Iteration 1509/3560 Training loss: 1.7615 0.8834 sec/batch
Epoch 9/20  Iteration 1510/3560 Training loss: 1.7611 0.8477 sec/batch
Epoch 9/20  Iteration 1511/3560 Training loss: 1.7609 0.8409 sec/batch
Epoch 9/20  Iteration 1512/3560 Training loss: 1.7605 0.8518 sec/batch
Epoch 9/20  Iteration 1513/3560 Training loss: 1.7600 0.8932 sec/batch
Epoch 9/20  Iteration 1514/3560 Training loss: 1.7601 1.0201 sec/batch
Epoch 9/20  Iteration 1515/3560 Training loss: 1.7597 0.8472 sec/batch
Epoch 9/20  Iteration 1516/3560 Training loss: 1.7595 0.8370 sec/batch
Epoch 9/20  Iteration 1517/3560 Training loss: 1.7591 0.8654 sec/batch
Epoch 9/20  Iteration 1518/3560 Training loss: 1.7586 0.8439 sec/batch
Epoch 9/20  Iteration 1519/3560 Training loss: 1.7583 0.8420 sec/batch
Epoch 9/20  Iteration 1520/3560 Training loss: 1.7582 0.8459 sec/batch
Epoch 9/20  Iteration 1521/3560 Training loss: 1.7581 0.8363 sec/batch
Epoch 9/20  Iteration 1522/3560 Training loss: 1.7576 0.8411 sec/batch
Epoch 9/20  Iteration 1523/3560 Training loss: 1.7571 0.8397 sec/batch
Epoch 9/20  Iteration 1524/3560 Training loss: 1.7565 0.8404 sec/batch
Epoch 9/20  Iteration 1525/3560 Training loss: 1.7565 0.8447 sec/batch
Epoch 9/20  Iteration 1526/3560 Training loss: 1.7563 0.8431 sec/batch
Epoch 9/20  Iteration 1527/3560 Training loss: 1.7560 0.8439 sec/batch
Epoch 9/20  Iteration 1528/3560 Training loss: 1.7558 0.8518 sec/batch
Epoch 9/20  Iteration 1529/3560 Training loss: 1.7554 0.8407 sec/batch
Epoch 9/20  Iteration 1530/3560 Training loss: 1.7552 0.8431 sec/batch
Epoch 9/20  Iteration 1531/3560 Training loss: 1.7551 0.8442 sec/batch
Epoch 9/20  Iteration 1532/3560 Training loss: 1.7550 0.8471 sec/batch
Epoch 9/20  Iteration 1533/3560 Training loss: 1.7549 0.8402 sec/batch
Epoch 9/20  Iteration 1534/3560 Training loss: 1.7549 0.8503 sec/batch
Epoch 9/20  Iteration 1535/3560 Training loss: 1.7547 0.8445 sec/batch
Epoch 9/20  Iteration 1536/3560 Training loss: 1.7546 0.8450 sec/batch
Epoch 9/20  Iteration 1537/3560 Training loss: 1.7543 0.8914 sec/batch
Epoch 9/20  Iteration 1538/3560 Training loss: 1.7542 0.8467 sec/batch
Epoch 9/20  Iteration 1539/3560 Training loss: 1.7539 0.8432 sec/batch
Epoch 9/20  Iteration 1540/3560 Training loss: 1.7534 0.8413 sec/batch
Epoch 9/20  Iteration 1541/3560 Training loss: 1.7534 0.8406 sec/batch
Epoch 9/20  Iteration 1542/3560 Training loss: 1.7533 0.8422 sec/batch
Epoch 9/20  Iteration 1543/3560 Training loss: 1.7532 0.8470 sec/batch
Epoch 9/20  Iteration 1544/3560 Training loss: 1.7531 0.8484 sec/batch
Epoch 9/20  Iteration 1545/3560 Training loss: 1.7530 0.8478 sec/batch
Epoch 9/20  Iteration 1546/3560 Training loss: 1.7525 0.8436 sec/batch
Epoch 9/20  Iteration 1547/3560 Training loss: 1.7522 0.8431 sec/batch
Epoch 9/20  Iteration 1548/3560 Training loss: 1.7522 0.8418 sec/batch
Epoch 9/20  Iteration 1549/3560 Training loss: 1.7521 0.8387 sec/batch
Epoch 9/20  Iteration 1550/3560 Training loss: 1.7517 0.8356 sec/batch
Epoch 9/20  Iteration 1551/3560 Training loss: 1.7518 0.8474 sec/batch
Epoch 9/20  Iteration 1552/3560 Training loss: 1.7517 0.8576 sec/batch
Epoch 9/20  Iteration 1553/3560 Training loss: 1.7516 0.8469 sec/batch
Epoch 9/20  Iteration 1554/3560 Training loss: 1.7515 0.8432 sec/batch
Epoch 9/20  Iteration 1555/3560 Training loss: 1.7511 0.8455 sec/batch
Epoch 9/20  Iteration 1556/3560 Training loss: 1.7508 0.8422 sec/batch
Epoch 9/20  Iteration 1557/3560 Training loss: 1.7508 0.8491 sec/batch
Epoch 9/20  Iteration 1558/3560 Training loss: 1.7507 0.8445 sec/batch
Epoch 9/20  Iteration 1559/3560 Training loss: 1.7507 0.8473 sec/batch
Epoch 9/20  Iteration 1560/3560 Training loss: 1.7506 0.8468 sec/batch
Epoch 9/20  Iteration 1561/3560 Training loss: 1.7507 0.8456 sec/batch
Epoch 9/20  Iteration 1562/3560 Training loss: 1.7507 0.8643 sec/batch
Epoch 9/20  Iteration 1563/3560 Training loss: 1.7508 0.8409 sec/batch
Epoch 9/20  Iteration 1564/3560 Training loss: 1.7506 0.8501 sec/batch
Epoch 9/20  Iteration 1565/3560 Training loss: 1.7508 0.8492 sec/batch
Epoch 9/20  Iteration 1566/3560 Training loss: 1.7507 0.8383 sec/batch
Epoch 9/20  Iteration 1567/3560 Training loss: 1.7506 0.8463 sec/batch
Epoch 9/20  Iteration 1568/3560 Training loss: 1.7506 0.8412 sec/batch
Epoch 9/20  Iteration 1569/3560 Training loss: 1.7504 0.8919 sec/batch
Epoch 9/20  Iteration 1570/3560 Training loss: 1.7504 0.8524 sec/batch
Epoch 9/20  Iteration 1571/3560 Training loss: 1.7504 0.8453 sec/batch
Epoch 9/20  Iteration 1572/3560 Training loss: 1.7506 0.8467 sec/batch
Epoch 9/20  Iteration 1573/3560 Training loss: 1.7506 0.8418 sec/batch
Epoch 9/20  Iteration 1574/3560 Training loss: 1.7504 0.8444 sec/batch
Epoch 9/20  Iteration 1575/3560 Training loss: 1.7501 0.8439 sec/batch
Epoch 9/20  Iteration 1576/3560 Training loss: 1.7501 0.8500 sec/batch
Epoch 9/20  Iteration 1577/3560 Training loss: 1.7501 0.8387 sec/batch
Epoch 9/20  Iteration 1578/3560 Training loss: 1.7500 0.8391 sec/batch
Epoch 9/20  Iteration 1579/3560 Training loss: 1.7499 0.8384 sec/batch
Epoch 9/20  Iteration 1580/3560 Training loss: 1.7499 0.8467 sec/batch
Epoch 9/20  Iteration 1581/3560 Training loss: 1.7499 0.8428 sec/batch
Epoch 9/20  Iteration 1582/3560 Training loss: 1.7498 0.8482 sec/batch
Epoch 9/20  Iteration 1583/3560 Training loss: 1.7496 0.9099 sec/batch
Epoch 9/20  Iteration 1584/3560 Training loss: 1.7497 0.8826 sec/batch
Epoch 9/20  Iteration 1585/3560 Training loss: 1.7498 0.9123 sec/batch
Epoch 9/20  Iteration 1586/3560 Training loss: 1.7497 0.8686 sec/batch
Epoch 9/20  Iteration 1587/3560 Training loss: 1.7498 0.8654 sec/batch
Epoch 9/20  Iteration 1588/3560 Training loss: 1.7498 0.8481 sec/batch
Epoch 9/20  Iteration 1589/3560 Training loss: 1.7497 0.8384 sec/batch
Epoch 9/20  Iteration 1590/3560 Training loss: 1.7497 0.8480 sec/batch
Epoch 9/20  Iteration 1591/3560 Training loss: 1.7497 0.8378 sec/batch
Epoch 9/20  Iteration 1592/3560 Training loss: 1.7500 0.8517 sec/batch
Epoch 9/20  Iteration 1593/3560 Training loss: 1.7500 1.0801 sec/batch
Epoch 9/20  Iteration 1594/3560 Training loss: 1.7499 0.8583 sec/batch
Epoch 9/20  Iteration 1595/3560 Training loss: 1.7497 0.8742 sec/batch
Epoch 9/20  Iteration 1596/3560 Training loss: 1.7495 0.8539 sec/batch
Epoch 9/20  Iteration 1597/3560 Training loss: 1.7495 0.8461 sec/batch
Epoch 9/20  Iteration 1598/3560 Training loss: 1.7495 0.8455 sec/batch
Epoch 9/20  Iteration 1599/3560 Training loss: 1.7495 0.8497 sec/batch
Epoch 9/20  Iteration 1600/3560 Training loss: 1.7493 0.8455 sec/batch
Epoch 9/20  Iteration 1601/3560 Training loss: 1.7492 0.8407 sec/batch
Epoch 9/20  Iteration 1602/3560 Training loss: 1.7491 0.8389 sec/batch
Epoch 10/20  Iteration 1603/3560 Training loss: 1.8305 0.8395 sec/batch
Epoch 10/20  Iteration 1604/3560 Training loss: 1.7789 0.8857 sec/batch
Epoch 10/20  Iteration 1605/3560 Training loss: 1.7626 0.8472 sec/batch
Epoch 10/20  Iteration 1606/3560 Training loss: 1.7533 0.8417 sec/batch
Epoch 10/20  Iteration 1607/3560 Training loss: 1.7480 0.8400 sec/batch
Epoch 10/20  Iteration 1608/3560 Training loss: 1.7369 0.8836 sec/batch
Epoch 10/20  Iteration 1609/3560 Training loss: 1.7367 0.8435 sec/batch
Epoch 10/20  Iteration 1610/3560 Training loss: 1.7353 0.8470 sec/batch
Epoch 10/20  Iteration 1611/3560 Training loss: 1.7363 0.8538 sec/batch
Epoch 10/20  Iteration 1612/3560 Training loss: 1.7355 0.8450 sec/batch
Epoch 10/20  Iteration 1613/3560 Training loss: 1.7315 0.8518 sec/batch
Epoch 10/20  Iteration 1614/3560 Training loss: 1.7298 0.8653 sec/batch
Epoch 10/20  Iteration 1615/3560 Training loss: 1.7299 0.8709 sec/batch
Epoch 10/20  Iteration 1616/3560 Training loss: 1.7321 0.8446 sec/batch
Epoch 10/20  Iteration 1617/3560 Training loss: 1.7317 0.8487 sec/batch
Epoch 10/20  Iteration 1618/3560 Training loss: 1.7301 0.8521 sec/batch
Epoch 10/20  Iteration 1619/3560 Training loss: 1.7301 0.8438 sec/batch
Epoch 10/20  Iteration 1620/3560 Training loss: 1.7315 0.8521 sec/batch
Epoch 10/20  Iteration 1621/3560 Training loss: 1.7317 0.8553 sec/batch
Epoch 10/20  Iteration 1622/3560 Training loss: 1.7327 0.8722 sec/batch
Epoch 10/20  Iteration 1623/3560 Training loss: 1.7324 0.8492 sec/batch
Epoch 10/20  Iteration 1624/3560 Training loss: 1.7325 0.8544 sec/batch
Epoch 10/20  Iteration 1625/3560 Training loss: 1.7319 0.8477 sec/batch
Epoch 10/20  Iteration 1626/3560 Training loss: 1.7314 0.8500 sec/batch
Epoch 10/20  Iteration 1627/3560 Training loss: 1.7313 0.8545 sec/batch
Epoch 10/20  Iteration 1628/3560 Training loss: 1.7298 0.8564 sec/batch
Epoch 10/20  Iteration 1629/3560 Training loss: 1.7283 0.8938 sec/batch
Epoch 10/20  Iteration 1630/3560 Training loss: 1.7288 0.8494 sec/batch
Epoch 10/20  Iteration 1631/3560 Training loss: 1.7292 0.8630 sec/batch
Epoch 10/20  Iteration 1632/3560 Training loss: 1.7294 0.8699 sec/batch
Epoch 10/20  Iteration 1633/3560 Training loss: 1.7297 0.8495 sec/batch
Epoch 10/20  Iteration 1634/3560 Training loss: 1.7289 0.8459 sec/batch
Epoch 10/20  Iteration 1635/3560 Training loss: 1.7291 0.8489 sec/batch
Epoch 10/20  Iteration 1636/3560 Training loss: 1.7298 0.8555 sec/batch
Epoch 10/20  Iteration 1637/3560 Training loss: 1.7295 0.8406 sec/batch
Epoch 10/20  Iteration 1638/3560 Training loss: 1.7292 0.9762 sec/batch
Epoch 10/20  Iteration 1639/3560 Training loss: 1.7285 0.8944 sec/batch
Epoch 10/20  Iteration 1640/3560 Training loss: 1.7273 0.8867 sec/batch
Epoch 10/20  Iteration 1641/3560 Training loss: 1.7261 0.8967 sec/batch
Epoch 10/20  Iteration 1642/3560 Training loss: 1.7255 0.8477 sec/batch
Epoch 10/20  Iteration 1643/3560 Training loss: 1.7249 0.8466 sec/batch
Epoch 10/20  Iteration 1644/3560 Training loss: 1.7252 0.9407 sec/batch
Epoch 10/20  Iteration 1645/3560 Training loss: 1.7246 0.9311 sec/batch
Epoch 10/20  Iteration 1646/3560 Training loss: 1.7237 0.8550 sec/batch
Epoch 10/20  Iteration 1647/3560 Training loss: 1.7239 0.8450 sec/batch
Epoch 10/20  Iteration 1648/3560 Training loss: 1.7227 0.8596 sec/batch
Epoch 10/20  Iteration 1649/3560 Training loss: 1.7225 0.8562 sec/batch
Epoch 10/20  Iteration 1650/3560 Training loss: 1.7221 0.8512 sec/batch
Epoch 10/20  Iteration 1651/3560 Training loss: 1.7220 0.8519 sec/batch
Epoch 10/20  Iteration 1652/3560 Training loss: 1.7226 0.8465 sec/batch
Epoch 10/20  Iteration 1653/3560 Training loss: 1.7221 0.8556 sec/batch
Epoch 10/20  Iteration 1654/3560 Training loss: 1.7230 0.8890 sec/batch
Epoch 10/20  Iteration 1655/3560 Training loss: 1.7228 0.9095 sec/batch
Epoch 10/20  Iteration 1656/3560 Training loss: 1.7228 0.9530 sec/batch
Epoch 10/20  Iteration 1657/3560 Training loss: 1.7226 0.9537 sec/batch
Epoch 10/20  Iteration 1658/3560 Training loss: 1.7228 0.9347 sec/batch
Epoch 10/20  Iteration 1659/3560 Training loss: 1.7231 0.8698 sec/batch
Epoch 10/20  Iteration 1660/3560 Training loss: 1.7228 0.8558 sec/batch
Epoch 10/20  Iteration 1661/3560 Training loss: 1.7222 0.9317 sec/batch
Epoch 10/20  Iteration 1662/3560 Training loss: 1.7228 0.8530 sec/batch
Epoch 10/20  Iteration 1663/3560 Training loss: 1.7227 0.8996 sec/batch
Epoch 10/20  Iteration 1664/3560 Training loss: 1.7235 0.8444 sec/batch
Epoch 10/20  Iteration 1665/3560 Training loss: 1.7240 0.8428 sec/batch
Epoch 10/20  Iteration 1666/3560 Training loss: 1.7243 0.8473 sec/batch
Epoch 10/20  Iteration 1667/3560 Training loss: 1.7241 0.8459 sec/batch
Epoch 10/20  Iteration 1668/3560 Training loss: 1.7244 0.8430 sec/batch
Epoch 10/20  Iteration 1669/3560 Training loss: 1.7246 0.8984 sec/batch
Epoch 10/20  Iteration 1670/3560 Training loss: 1.7242 0.9508 sec/batch
Epoch 10/20  Iteration 1671/3560 Training loss: 1.7242 0.9766 sec/batch
Epoch 10/20  Iteration 1672/3560 Training loss: 1.7241 0.9122 sec/batch
Epoch 10/20  Iteration 1673/3560 Training loss: 1.7247 0.9160 sec/batch
Epoch 10/20  Iteration 1674/3560 Training loss: 1.7248 0.8670 sec/batch
Epoch 10/20  Iteration 1675/3560 Training loss: 1.7252 0.8445 sec/batch
Epoch 10/20  Iteration 1676/3560 Training loss: 1.7250 0.8485 sec/batch
Epoch 10/20  Iteration 1677/3560 Training loss: 1.7248 0.8459 sec/batch
Epoch 10/20  Iteration 1678/3560 Training loss: 1.7251 0.8461 sec/batch
Epoch 10/20  Iteration 1679/3560 Training loss: 1.7250 0.8408 sec/batch
Epoch 10/20  Iteration 1680/3560 Training loss: 1.7249 0.8505 sec/batch
Epoch 10/20  Iteration 1681/3560 Training loss: 1.7243 0.8480 sec/batch
Epoch 10/20  Iteration 1682/3560 Training loss: 1.7240 0.8487 sec/batch
Epoch 10/20  Iteration 1683/3560 Training loss: 1.7234 0.8973 sec/batch
Epoch 10/20  Iteration 1684/3560 Training loss: 1.7237 0.8462 sec/batch
Epoch 10/20  Iteration 1685/3560 Training loss: 1.7231 0.8414 sec/batch
Epoch 10/20  Iteration 1686/3560 Training loss: 1.7231 0.8409 sec/batch
Epoch 10/20  Iteration 1687/3560 Training loss: 1.7226 0.8500 sec/batch
Epoch 10/20  Iteration 1688/3560 Training loss: 1.7223 0.8454 sec/batch
Epoch 10/20  Iteration 1689/3560 Training loss: 1.7222 0.8413 sec/batch
Epoch 10/20  Iteration 1690/3560 Training loss: 1.7218 0.8549 sec/batch
Epoch 10/20  Iteration 1691/3560 Training loss: 1.7213 0.8439 sec/batch
Epoch 10/20  Iteration 1692/3560 Training loss: 1.7213 0.8887 sec/batch
Epoch 10/20  Iteration 1693/3560 Training loss: 1.7209 1.1087 sec/batch
Epoch 10/20  Iteration 1694/3560 Training loss: 1.7206 1.3824 sec/batch
Epoch 10/20  Iteration 1695/3560 Training loss: 1.7201 1.1389 sec/batch
Epoch 10/20  Iteration 1696/3560 Training loss: 1.7197 1.2982 sec/batch
Epoch 10/20  Iteration 1697/3560 Training loss: 1.7194 0.9614 sec/batch
Epoch 10/20  Iteration 1698/3560 Training loss: 1.7192 0.9204 sec/batch
Epoch 10/20  Iteration 1699/3560 Training loss: 1.7191 0.8608 sec/batch
Epoch 10/20  Iteration 1700/3560 Training loss: 1.7187 0.8556 sec/batch
Epoch 10/20  Iteration 1701/3560 Training loss: 1.7183 0.9018 sec/batch
Epoch 10/20  Iteration 1702/3560 Training loss: 1.7177 0.8565 sec/batch
Epoch 10/20  Iteration 1703/3560 Training loss: 1.7177 0.8579 sec/batch
Epoch 10/20  Iteration 1704/3560 Training loss: 1.7175 0.8576 sec/batch
Epoch 10/20  Iteration 1705/3560 Training loss: 1.7173 0.8644 sec/batch
Epoch 10/20  Iteration 1706/3560 Training loss: 1.7170 0.8752 sec/batch
Epoch 10/20  Iteration 1707/3560 Training loss: 1.7168 0.8599 sec/batch
Epoch 10/20  Iteration 1708/3560 Training loss: 1.7166 0.8696 sec/batch
Epoch 10/20  Iteration 1709/3560 Training loss: 1.7165 0.8629 sec/batch
Epoch 10/20  Iteration 1710/3560 Training loss: 1.7164 0.9079 sec/batch
Epoch 10/20  Iteration 1711/3560 Training loss: 1.7164 0.8599 sec/batch
Epoch 10/20  Iteration 1712/3560 Training loss: 1.7165 0.8831 sec/batch
Epoch 10/20  Iteration 1713/3560 Training loss: 1.7163 0.8847 sec/batch
Epoch 10/20  Iteration 1714/3560 Training loss: 1.7162 0.8575 sec/batch
Epoch 10/20  Iteration 1715/3560 Training loss: 1.7160 0.8987 sec/batch
Epoch 10/20  Iteration 1716/3560 Training loss: 1.7158 0.8616 sec/batch
Epoch 10/20  Iteration 1717/3560 Training loss: 1.7155 0.8708 sec/batch
Epoch 10/20  Iteration 1718/3560 Training loss: 1.7151 0.8584 sec/batch
Epoch 10/20  Iteration 1719/3560 Training loss: 1.7151 0.8589 sec/batch
Epoch 10/20  Iteration 1720/3560 Training loss: 1.7150 0.8607 sec/batch
Epoch 10/20  Iteration 1721/3560 Training loss: 1.7148 0.8603 sec/batch
Epoch 10/20  Iteration 1722/3560 Training loss: 1.7148 0.8632 sec/batch
Epoch 10/20  Iteration 1723/3560 Training loss: 1.7147 0.8849 sec/batch
Epoch 10/20  Iteration 1724/3560 Training loss: 1.7142 0.8655 sec/batch
Epoch 10/20  Iteration 1725/3560 Training loss: 1.7139 0.8677 sec/batch
Epoch 10/20  Iteration 1726/3560 Training loss: 1.7140 0.8634 sec/batch
Epoch 10/20  Iteration 1727/3560 Training loss: 1.7138 0.8570 sec/batch
Epoch 10/20  Iteration 1728/3560 Training loss: 1.7134 0.8587 sec/batch
Epoch 10/20  Iteration 1729/3560 Training loss: 1.7134 0.8637 sec/batch
Epoch 10/20  Iteration 1730/3560 Training loss: 1.7134 0.8642 sec/batch
Epoch 10/20  Iteration 1731/3560 Training loss: 1.7132 0.8598 sec/batch
Epoch 10/20  Iteration 1732/3560 Training loss: 1.7130 0.8622 sec/batch
Epoch 10/20  Iteration 1733/3560 Training loss: 1.7127 0.8661 sec/batch
Epoch 10/20  Iteration 1734/3560 Training loss: 1.7124 0.8645 sec/batch
Epoch 10/20  Iteration 1735/3560 Training loss: 1.7124 0.9019 sec/batch
Epoch 10/20  Iteration 1736/3560 Training loss: 1.7124 0.8592 sec/batch
Epoch 10/20  Iteration 1737/3560 Training loss: 1.7124 0.8579 sec/batch
Epoch 10/20  Iteration 1738/3560 Training loss: 1.7124 0.8557 sec/batch
Epoch 10/20  Iteration 1739/3560 Training loss: 1.7124 0.8605 sec/batch
Epoch 10/20  Iteration 1740/3560 Training loss: 1.7124 0.8662 sec/batch
Epoch 10/20  Iteration 1741/3560 Training loss: 1.7124 0.8579 sec/batch
Epoch 10/20  Iteration 1742/3560 Training loss: 1.7123 0.8494 sec/batch
Epoch 10/20  Iteration 1743/3560 Training loss: 1.7126 0.8610 sec/batch
Epoch 10/20  Iteration 1744/3560 Training loss: 1.7124 0.8506 sec/batch
Epoch 10/20  Iteration 1745/3560 Training loss: 1.7123 0.8617 sec/batch
Epoch 10/20  Iteration 1746/3560 Training loss: 1.7124 0.8603 sec/batch
Epoch 10/20  Iteration 1747/3560 Training loss: 1.7122 0.8615 sec/batch
Epoch 10/20  Iteration 1748/3560 Training loss: 1.7122 0.8572 sec/batch
Epoch 10/20  Iteration 1749/3560 Training loss: 1.7121 0.8569 sec/batch
Epoch 10/20  Iteration 1750/3560 Training loss: 1.7123 0.9071 sec/batch
Epoch 10/20  Iteration 1751/3560 Training loss: 1.7123 0.8563 sec/batch
Epoch 10/20  Iteration 1752/3560 Training loss: 1.7121 0.8514 sec/batch
Epoch 10/20  Iteration 1753/3560 Training loss: 1.7118 0.8571 sec/batch
Epoch 10/20  Iteration 1754/3560 Training loss: 1.7118 0.8610 sec/batch
Epoch 10/20  Iteration 1755/3560 Training loss: 1.7118 0.8701 sec/batch
Epoch 10/20  Iteration 1756/3560 Training loss: 1.7119 0.8591 sec/batch
Epoch 10/20  Iteration 1757/3560 Training loss: 1.7118 0.8790 sec/batch
Epoch 10/20  Iteration 1758/3560 Training loss: 1.7117 0.8670 sec/batch
Epoch 10/20  Iteration 1759/3560 Training loss: 1.7117 0.8608 sec/batch
Epoch 10/20  Iteration 1760/3560 Training loss: 1.7117 0.8651 sec/batch
Epoch 10/20  Iteration 1761/3560 Training loss: 1.7115 0.8589 sec/batch
Epoch 10/20  Iteration 1762/3560 Training loss: 1.7116 0.8631 sec/batch
Epoch 10/20  Iteration 1763/3560 Training loss: 1.7118 0.8606 sec/batch
Epoch 10/20  Iteration 1764/3560 Training loss: 1.7117 0.8584 sec/batch
Epoch 10/20  Iteration 1765/3560 Training loss: 1.7118 0.8609 sec/batch
Epoch 10/20  Iteration 1766/3560 Training loss: 1.7118 0.8565 sec/batch
Epoch 10/20  Iteration 1767/3560 Training loss: 1.7118 0.8665 sec/batch
Epoch 10/20  Iteration 1768/3560 Training loss: 1.7117 0.8619 sec/batch
Epoch 10/20  Iteration 1769/3560 Training loss: 1.7117 0.8690 sec/batch
Epoch 10/20  Iteration 1770/3560 Training loss: 1.7121 0.8589 sec/batch
Epoch 10/20  Iteration 1771/3560 Training loss: 1.7120 0.8606 sec/batch
Epoch 10/20  Iteration 1772/3560 Training loss: 1.7120 0.8567 sec/batch
Epoch 10/20  Iteration 1773/3560 Training loss: 1.7118 0.9875 sec/batch
Epoch 10/20  Iteration 1774/3560 Training loss: 1.7116 0.8672 sec/batch
Epoch 10/20  Iteration 1775/3560 Training loss: 1.7116 0.9038 sec/batch
Epoch 10/20  Iteration 1776/3560 Training loss: 1.7115 0.9220 sec/batch
Epoch 10/20  Iteration 1777/3560 Training loss: 1.7116 0.8606 sec/batch
Epoch 10/20  Iteration 1778/3560 Training loss: 1.7114 0.8617 sec/batch
Epoch 10/20  Iteration 1779/3560 Training loss: 1.7113 0.8567 sec/batch
Epoch 10/20  Iteration 1780/3560 Training loss: 1.7113 0.8622 sec/batch
Epoch 11/20  Iteration 1781/3560 Training loss: 1.7874 0.8610 sec/batch
Epoch 11/20  Iteration 1782/3560 Training loss: 1.7442 0.8519 sec/batch
Epoch 11/20  Iteration 1783/3560 Training loss: 1.7284 0.8600 sec/batch
Epoch 11/20  Iteration 1784/3560 Training loss: 1.7193 0.8641 sec/batch
Epoch 11/20  Iteration 1785/3560 Training loss: 1.7131 0.8633 sec/batch
Epoch 11/20  Iteration 1786/3560 Training loss: 1.7032 0.8596 sec/batch
Epoch 11/20  Iteration 1787/3560 Training loss: 1.7034 0.8959 sec/batch
Epoch 11/20  Iteration 1788/3560 Training loss: 1.7016 0.8595 sec/batch
Epoch 11/20  Iteration 1789/3560 Training loss: 1.7031 0.8614 sec/batch
Epoch 11/20  Iteration 1790/3560 Training loss: 1.7012 0.8542 sec/batch
Epoch 11/20  Iteration 1791/3560 Training loss: 1.6977 0.8634 sec/batch
Epoch 11/20  Iteration 1792/3560 Training loss: 1.6955 0.8741 sec/batch
Epoch 11/20  Iteration 1793/3560 Training loss: 1.6960 0.8670 sec/batch
Epoch 11/20  Iteration 1794/3560 Training loss: 1.6983 0.8601 sec/batch
Epoch 11/20  Iteration 1795/3560 Training loss: 1.6973 0.8605 sec/batch
Epoch 11/20  Iteration 1796/3560 Training loss: 1.6956 0.8586 sec/batch
Epoch 11/20  Iteration 1797/3560 Training loss: 1.6956 0.8572 sec/batch
Epoch 11/20  Iteration 1798/3560 Training loss: 1.6972 0.8606 sec/batch
Epoch 11/20  Iteration 1799/3560 Training loss: 1.6974 0.8593 sec/batch
Epoch 11/20  Iteration 1800/3560 Training loss: 1.6986 0.8654 sec/batch
Epoch 11/20  Iteration 1801/3560 Training loss: 1.6980 0.8663 sec/batch
Epoch 11/20  Iteration 1802/3560 Training loss: 1.6989 0.8598 sec/batch
Epoch 11/20  Iteration 1803/3560 Training loss: 1.6983 0.8619 sec/batch
Epoch 11/20  Iteration 1804/3560 Training loss: 1.6981 0.8529 sec/batch
Epoch 11/20  Iteration 1805/3560 Training loss: 1.6977 0.8484 sec/batch
Epoch 11/20  Iteration 1806/3560 Training loss: 1.6965 0.8647 sec/batch
Epoch 11/20  Iteration 1807/3560 Training loss: 1.6950 0.8600 sec/batch
Epoch 11/20  Iteration 1808/3560 Training loss: 1.6952 0.8568 sec/batch
Epoch 11/20  Iteration 1809/3560 Training loss: 1.6960 0.8549 sec/batch
Epoch 11/20  Iteration 1810/3560 Training loss: 1.6959 0.8642 sec/batch
Epoch 11/20  Iteration 1811/3560 Training loss: 1.6961 0.8681 sec/batch
Epoch 11/20  Iteration 1812/3560 Training loss: 1.6952 0.8831 sec/batch
Epoch 11/20  Iteration 1813/3560 Training loss: 1.6951 0.9143 sec/batch
Epoch 11/20  Iteration 1814/3560 Training loss: 1.6955 0.8633 sec/batch
Epoch 11/20  Iteration 1815/3560 Training loss: 1.6954 0.8604 sec/batch
Epoch 11/20  Iteration 1816/3560 Training loss: 1.6949 0.8599 sec/batch
Epoch 11/20  Iteration 1817/3560 Training loss: 1.6944 0.8650 sec/batch
Epoch 11/20  Iteration 1818/3560 Training loss: 1.6930 0.8549 sec/batch
Epoch 11/20  Iteration 1819/3560 Training loss: 1.6918 0.9087 sec/batch
Epoch 11/20  Iteration 1820/3560 Training loss: 1.6910 0.9044 sec/batch
Epoch 11/20  Iteration 1821/3560 Training loss: 1.6907 0.8725 sec/batch
Epoch 11/20  Iteration 1822/3560 Training loss: 1.6912 0.9112 sec/batch
Epoch 11/20  Iteration 1823/3560 Training loss: 1.6907 0.9399 sec/batch
Epoch 11/20  Iteration 1824/3560 Training loss: 1.6900 0.9160 sec/batch
Epoch 11/20  Iteration 1825/3560 Training loss: 1.6905 0.8954 sec/batch
Epoch 11/20  Iteration 1826/3560 Training loss: 1.6895 0.8820 sec/batch
Epoch 11/20  Iteration 1827/3560 Training loss: 1.6890 0.8577 sec/batch
Epoch 11/20  Iteration 1828/3560 Training loss: 1.6886 0.8573 sec/batch
Epoch 11/20  Iteration 1829/3560 Training loss: 1.6884 0.8580 sec/batch
Epoch 11/20  Iteration 1830/3560 Training loss: 1.6889 0.8597 sec/batch
Epoch 11/20  Iteration 1831/3560 Training loss: 1.6883 0.8666 sec/batch
Epoch 11/20  Iteration 1832/3560 Training loss: 1.6891 0.8638 sec/batch
Epoch 11/20  Iteration 1833/3560 Training loss: 1.6890 0.8639 sec/batch
Epoch 11/20  Iteration 1834/3560 Training loss: 1.6891 0.8549 sec/batch
Epoch 11/20  Iteration 1835/3560 Training loss: 1.6888 0.8583 sec/batch
Epoch 11/20  Iteration 1836/3560 Training loss: 1.6890 0.8650 sec/batch
Epoch 11/20  Iteration 1837/3560 Training loss: 1.6893 0.8587 sec/batch
Epoch 11/20  Iteration 1838/3560 Training loss: 1.6888 0.8568 sec/batch
Epoch 11/20  Iteration 1839/3560 Training loss: 1.6883 0.8670 sec/batch
Epoch 11/20  Iteration 1840/3560 Training loss: 1.6888 0.8567 sec/batch
Epoch 11/20  Iteration 1841/3560 Training loss: 1.6888 0.8658 sec/batch
Epoch 11/20  Iteration 1842/3560 Training loss: 1.6896 0.8846 sec/batch
Epoch 11/20  Iteration 1843/3560 Training loss: 1.6900 0.8609 sec/batch
Epoch 11/20  Iteration 1844/3560 Training loss: 1.6902 0.8586 sec/batch
Epoch 11/20  Iteration 1845/3560 Training loss: 1.6900 0.8655 sec/batch
Epoch 11/20  Iteration 1846/3560 Training loss: 1.6902 0.8539 sec/batch
Epoch 11/20  Iteration 1847/3560 Training loss: 1.6904 0.8577 sec/batch
Epoch 11/20  Iteration 1848/3560 Training loss: 1.6901 0.8605 sec/batch
Epoch 11/20  Iteration 1849/3560 Training loss: 1.6899 0.8570 sec/batch
Epoch 11/20  Iteration 1850/3560 Training loss: 1.6899 0.8680 sec/batch
Epoch 11/20  Iteration 1851/3560 Training loss: 1.6905 0.8646 sec/batch
Epoch 11/20  Iteration 1852/3560 Training loss: 1.6906 0.8602 sec/batch
Epoch 11/20  Iteration 1853/3560 Training loss: 1.6911 0.8629 sec/batch
Epoch 11/20  Iteration 1854/3560 Training loss: 1.6909 0.8555 sec/batch
Epoch 11/20  Iteration 1855/3560 Training loss: 1.6907 0.8506 sec/batch
Epoch 11/20  Iteration 1856/3560 Training loss: 1.6910 0.8576 sec/batch
Epoch 11/20  Iteration 1857/3560 Training loss: 1.6910 0.8545 sec/batch
Epoch 11/20  Iteration 1858/3560 Training loss: 1.6912 0.8612 sec/batch
Epoch 11/20  Iteration 1859/3560 Training loss: 1.6906 0.8631 sec/batch
Epoch 11/20  Iteration 1860/3560 Training loss: 1.6904 0.9210 sec/batch
Epoch 11/20  Iteration 1861/3560 Training loss: 1.6897 0.8582 sec/batch
Epoch 11/20  Iteration 1862/3560 Training loss: 1.6899 0.8629 sec/batch
Epoch 11/20  Iteration 1863/3560 Training loss: 1.6893 0.8584 sec/batch
Epoch 11/20  Iteration 1864/3560 Training loss: 1.6892 0.8607 sec/batch
Epoch 11/20  Iteration 1865/3560 Training loss: 1.6887 0.8560 sec/batch
Epoch 11/20  Iteration 1866/3560 Training loss: 1.6884 0.8586 sec/batch
Epoch 11/20  Iteration 1867/3560 Training loss: 1.6883 0.8527 sec/batch
Epoch 11/20  Iteration 1868/3560 Training loss: 1.6880 0.9085 sec/batch
Epoch 11/20  Iteration 1869/3560 Training loss: 1.6875 0.8550 sec/batch
Epoch 11/20  Iteration 1870/3560 Training loss: 1.6876 0.8552 sec/batch
Epoch 11/20  Iteration 1871/3560 Training loss: 1.6871 0.8552 sec/batch
Epoch 11/20  Iteration 1872/3560 Training loss: 1.6869 0.8539 sec/batch
Epoch 11/20  Iteration 1873/3560 Training loss: 1.6864 0.8670 sec/batch
Epoch 11/20  Iteration 1874/3560 Training loss: 1.6860 0.8594 sec/batch
Epoch 11/20  Iteration 1875/3560 Training loss: 1.6856 0.9048 sec/batch
Epoch 11/20  Iteration 1876/3560 Training loss: 1.6857 0.8630 sec/batch
Epoch 11/20  Iteration 1877/3560 Training loss: 1.6855 0.8682 sec/batch
Epoch 11/20  Iteration 1878/3560 Training loss: 1.6851 0.8622 sec/batch
Epoch 11/20  Iteration 1879/3560 Training loss: 1.6847 0.8598 sec/batch
Epoch 11/20  Iteration 1880/3560 Training loss: 1.6841 0.8543 sec/batch
Epoch 11/20  Iteration 1881/3560 Training loss: 1.6840 0.8578 sec/batch
Epoch 11/20  Iteration 1882/3560 Training loss: 1.6838 0.8578 sec/batch
Epoch 11/20  Iteration 1883/3560 Training loss: 1.6836 0.8575 sec/batch
Epoch 11/20  Iteration 1884/3560 Training loss: 1.6833 0.8634 sec/batch
Epoch 11/20  Iteration 1885/3560 Training loss: 1.6831 0.8572 sec/batch
Epoch 11/20  Iteration 1886/3560 Training loss: 1.6830 0.8644 sec/batch
Epoch 11/20  Iteration 1887/3560 Training loss: 1.6828 0.8593 sec/batch
Epoch 11/20  Iteration 1888/3560 Training loss: 1.6827 0.8634 sec/batch
Epoch 11/20  Iteration 1889/3560 Training loss: 1.6827 0.8592 sec/batch
Epoch 11/20  Iteration 1890/3560 Training loss: 1.6828 0.8810 sec/batch
Epoch 11/20  Iteration 1891/3560 Training loss: 1.6825 0.8534 sec/batch
Epoch 11/20  Iteration 1892/3560 Training loss: 1.6824 0.8559 sec/batch
Epoch 11/20  Iteration 1893/3560 Training loss: 1.6822 0.8534 sec/batch
Epoch 11/20  Iteration 1894/3560 Training loss: 1.6820 0.8682 sec/batch
Epoch 11/20  Iteration 1895/3560 Training loss: 1.6817 0.8731 sec/batch
Epoch 11/20  Iteration 1896/3560 Training loss: 1.6813 0.8594 sec/batch
Epoch 11/20  Iteration 1897/3560 Training loss: 1.6813 0.8630 sec/batch
Epoch 11/20  Iteration 1898/3560 Training loss: 1.6813 0.8587 sec/batch
Epoch 11/20  Iteration 1899/3560 Training loss: 1.6811 0.9009 sec/batch
Epoch 11/20  Iteration 1900/3560 Training loss: 1.6810 0.8556 sec/batch
Epoch 11/20  Iteration 1901/3560 Training loss: 1.6809 0.8662 sec/batch
Epoch 11/20  Iteration 1902/3560 Training loss: 1.6805 0.8651 sec/batch
Epoch 11/20  Iteration 1903/3560 Training loss: 1.6801 0.8603 sec/batch
Epoch 11/20  Iteration 1904/3560 Training loss: 1.6802 0.8625 sec/batch
Epoch 11/20  Iteration 1905/3560 Training loss: 1.6800 0.8580 sec/batch
Epoch 11/20  Iteration 1906/3560 Training loss: 1.6796 0.8549 sec/batch
Epoch 11/20  Iteration 1907/3560 Training loss: 1.6797 0.8572 sec/batch
Epoch 11/20  Iteration 1908/3560 Training loss: 1.6797 0.9182 sec/batch
Epoch 11/20  Iteration 1909/3560 Training loss: 1.6796 0.8591 sec/batch
Epoch 11/20  Iteration 1910/3560 Training loss: 1.6794 0.9047 sec/batch
Epoch 11/20  Iteration 1911/3560 Training loss: 1.6790 0.9566 sec/batch
Epoch 11/20  Iteration 1912/3560 Training loss: 1.6787 0.8691 sec/batch
Epoch 11/20  Iteration 1913/3560 Training loss: 1.6788 0.9049 sec/batch
Epoch 11/20  Iteration 1914/3560 Training loss: 1.6788 0.8523 sec/batch
Epoch 11/20  Iteration 1915/3560 Training loss: 1.6788 0.8541 sec/batch
Epoch 11/20  Iteration 1916/3560 Training loss: 1.6788 0.8615 sec/batch
Epoch 11/20  Iteration 1917/3560 Training loss: 1.6789 0.9070 sec/batch
Epoch 11/20  Iteration 1918/3560 Training loss: 1.6788 0.8581 sec/batch
Epoch 11/20  Iteration 1919/3560 Training loss: 1.6790 0.8639 sec/batch
Epoch 11/20  Iteration 1920/3560 Training loss: 1.6788 0.8834 sec/batch
Epoch 11/20  Iteration 1921/3560 Training loss: 1.6791 0.8606 sec/batch
Epoch 11/20  Iteration 1922/3560 Training loss: 1.6790 0.8584 sec/batch
Epoch 11/20  Iteration 1923/3560 Training loss: 1.6790 0.8600 sec/batch
Epoch 11/20  Iteration 1924/3560 Training loss: 1.6791 0.8546 sec/batch
Epoch 11/20  Iteration 1925/3560 Training loss: 1.6789 0.8544 sec/batch
Epoch 11/20  Iteration 1926/3560 Training loss: 1.6789 0.8609 sec/batch
Epoch 11/20  Iteration 1927/3560 Training loss: 1.6789 0.8591 sec/batch
Epoch 11/20  Iteration 1928/3560 Training loss: 1.6791 0.8604 sec/batch
Epoch 11/20  Iteration 1929/3560 Training loss: 1.6791 0.8747 sec/batch
Epoch 11/20  Iteration 1930/3560 Training loss: 1.6790 0.8658 sec/batch
Epoch 11/20  Iteration 1931/3560 Training loss: 1.6787 0.8607 sec/batch
Epoch 11/20  Iteration 1932/3560 Training loss: 1.6787 0.8777 sec/batch
Epoch 11/20  Iteration 1933/3560 Training loss: 1.6787 0.8619 sec/batch
Epoch 11/20  Iteration 1934/3560 Training loss: 1.6788 0.8842 sec/batch
Epoch 11/20  Iteration 1935/3560 Training loss: 1.6787 1.1887 sec/batch
Epoch 11/20  Iteration 1936/3560 Training loss: 1.6787 0.9746 sec/batch
Epoch 11/20  Iteration 1937/3560 Training loss: 1.6787 0.8539 sec/batch
Epoch 11/20  Iteration 1938/3560 Training loss: 1.6787 0.8588 sec/batch
Epoch 11/20  Iteration 1939/3560 Training loss: 1.6785 0.8622 sec/batch
Epoch 11/20  Iteration 1940/3560 Training loss: 1.6786 0.8566 sec/batch
Epoch 11/20  Iteration 1941/3560 Training loss: 1.6788 0.8588 sec/batch
Epoch 11/20  Iteration 1942/3560 Training loss: 1.6787 0.8551 sec/batch
Epoch 11/20  Iteration 1943/3560 Training loss: 1.6788 0.8584 sec/batch
Epoch 11/20  Iteration 1944/3560 Training loss: 1.6788 0.8675 sec/batch
Epoch 11/20  Iteration 1945/3560 Training loss: 1.6788 0.8988 sec/batch
Epoch 11/20  Iteration 1946/3560 Training loss: 1.6787 1.0035 sec/batch
Epoch 11/20  Iteration 1947/3560 Training loss: 1.6788 0.9684 sec/batch
Epoch 11/20  Iteration 1948/3560 Training loss: 1.6792 0.8928 sec/batch
Epoch 11/20  Iteration 1949/3560 Training loss: 1.6792 0.8693 sec/batch
Epoch 11/20  Iteration 1950/3560 Training loss: 1.6791 0.8677 sec/batch
Epoch 11/20  Iteration 1951/3560 Training loss: 1.6790 0.8704 sec/batch
Epoch 11/20  Iteration 1952/3560 Training loss: 1.6788 0.8796 sec/batch
Epoch 11/20  Iteration 1953/3560 Training loss: 1.6789 0.8865 sec/batch
Epoch 11/20  Iteration 1954/3560 Training loss: 1.6789 0.8610 sec/batch
Epoch 11/20  Iteration 1955/3560 Training loss: 1.6789 0.9191 sec/batch
Epoch 11/20  Iteration 1956/3560 Training loss: 1.6788 0.8916 sec/batch
Epoch 11/20  Iteration 1957/3560 Training loss: 1.6787 0.9148 sec/batch
Epoch 11/20  Iteration 1958/3560 Training loss: 1.6787 0.8646 sec/batch
Epoch 12/20  Iteration 1959/3560 Training loss: 1.7578 0.8692 sec/batch
Epoch 12/20  Iteration 1960/3560 Training loss: 1.7150 0.8729 sec/batch
Epoch 12/20  Iteration 1961/3560 Training loss: 1.6996 0.8629 sec/batch
Epoch 12/20  Iteration 1962/3560 Training loss: 1.6912 0.8826 sec/batch
Epoch 12/20  Iteration 1963/3560 Training loss: 1.6846 0.9259 sec/batch
Epoch 12/20  Iteration 1964/3560 Training loss: 1.6737 0.8589 sec/batch
Epoch 12/20  Iteration 1965/3560 Training loss: 1.6737 0.8830 sec/batch
Epoch 12/20  Iteration 1966/3560 Training loss: 1.6716 0.8637 sec/batch
Epoch 12/20  Iteration 1967/3560 Training loss: 1.6722 0.8622 sec/batch
Epoch 12/20  Iteration 1968/3560 Training loss: 1.6712 0.8633 sec/batch
Epoch 12/20  Iteration 1969/3560 Training loss: 1.6677 0.8684 sec/batch
Epoch 12/20  Iteration 1970/3560 Training loss: 1.6658 0.8637 sec/batch
Epoch 12/20  Iteration 1971/3560 Training loss: 1.6656 0.8679 sec/batch
Epoch 12/20  Iteration 1972/3560 Training loss: 1.6672 0.8773 sec/batch
Epoch 12/20  Iteration 1973/3560 Training loss: 1.6662 0.8687 sec/batch
Epoch 12/20  Iteration 1974/3560 Training loss: 1.6644 0.8705 sec/batch
Epoch 12/20  Iteration 1975/3560 Training loss: 1.6640 0.8591 sec/batch
Epoch 12/20  Iteration 1976/3560 Training loss: 1.6658 0.8664 sec/batch
Epoch 12/20  Iteration 1977/3560 Training loss: 1.6666 0.8671 sec/batch
Epoch 12/20  Iteration 1978/3560 Training loss: 1.6675 0.8656 sec/batch
Epoch 12/20  Iteration 1979/3560 Training loss: 1.6665 0.8730 sec/batch
Epoch 12/20  Iteration 1980/3560 Training loss: 1.6672 0.8750 sec/batch
Epoch 12/20  Iteration 1981/3560 Training loss: 1.6669 0.8667 sec/batch
Epoch 12/20  Iteration 1982/3560 Training loss: 1.6668 0.8676 sec/batch
Epoch 12/20  Iteration 1983/3560 Training loss: 1.6666 0.8681 sec/batch
Epoch 12/20  Iteration 1984/3560 Training loss: 1.6651 0.8691 sec/batch
Epoch 12/20  Iteration 1985/3560 Training loss: 1.6639 0.8669 sec/batch
Epoch 12/20  Iteration 1986/3560 Training loss: 1.6643 0.8724 sec/batch
Epoch 12/20  Iteration 1987/3560 Training loss: 1.6650 0.9113 sec/batch
... [iterations 1988-2135 omitted; training loss declines gradually] ...
Epoch 12/20  Iteration 2136/3560 Training loss: 1.6505 0.8610 sec/batch
Epoch 13/20  Iteration 2137/3560 Training loss: 1.7270 0.8650 sec/batch
... [iterations 2138-2313 omitted; training loss declines gradually] ...
Epoch 13/20  Iteration 2314/3560 Training loss: 1.6232 0.8579 sec/batch
Epoch 14/20  Iteration 2315/3560 Training loss: 1.7008 0.8712 sec/batch
... [iterations 2316-2491 omitted; training loss declines gradually] ...
Epoch 14/20  Iteration 2492/3560 Training loss: 1.5997 0.8624 sec/batch
Epoch 15/20  Iteration 2493/3560 Training loss: 1.6814 0.8725 sec/batch
... [iterations 2494-2669 omitted; training loss declines gradually] ...
Epoch 15/20  Iteration 2670/3560 Training loss: 1.5795 0.9026 sec/batch
Epoch 16/20  Iteration 2671/3560 Training loss: 1.6697 0.8728 sec/batch
... [iterations 2672-2847 omitted; training loss declines gradually] ...
Epoch 16/20  Iteration 2848/3560 Training loss: 1.5596 0.8620 sec/batch
Epoch 17/20  Iteration 2849/3560 Training loss: 1.6480 0.8614 sec/batch
Epoch 17/20  Iteration 2850/3560 Training loss: 1.6017 0.8602 sec/batch
Epoch 17/20  Iteration 2851/3560 Training loss: 1.5876 0.9048 sec/batch
Epoch 17/20  Iteration 2852/3560 Training loss: 1.5815 0.8710 sec/batch
Epoch 17/20  Iteration 2853/3560 Training loss: 1.5731 0.8638 sec/batch
Epoch 17/20  Iteration 2854/3560 Training loss: 1.5611 0.8610 sec/batch
Epoch 17/20  Iteration 2855/3560 Training loss: 1.5610 0.9066 sec/batch
Epoch 17/20  Iteration 2856/3560 Training loss: 1.5582 0.8581 sec/batch
Epoch 17/20  Iteration 2857/3560 Training loss: 1.5594 0.8674 sec/batch
Epoch 17/20  Iteration 2858/3560 Training loss: 1.5584 0.8800 sec/batch
Epoch 17/20  Iteration 2859/3560 Training loss: 1.5547 1.0466 sec/batch
Epoch 17/20  Iteration 2860/3560 Training loss: 1.5534 0.9462 sec/batch
Epoch 17/20  Iteration 2861/3560 Training loss: 1.5530 0.9519 sec/batch
Epoch 17/20  Iteration 2862/3560 Training loss: 1.5543 0.8693 sec/batch
Epoch 17/20  Iteration 2863/3560 Training loss: 1.5533 0.9047 sec/batch
Epoch 17/20  Iteration 2864/3560 Training loss: 1.5515 0.8653 sec/batch
Epoch 17/20  Iteration 2865/3560 Training loss: 1.5517 0.8691 sec/batch
Epoch 17/20  Iteration 2866/3560 Training loss: 1.5531 0.8631 sec/batch
Epoch 17/20  Iteration 2867/3560 Training loss: 1.5538 0.8686 sec/batch
Epoch 17/20  Iteration 2868/3560 Training loss: 1.5556 0.8712 sec/batch
Epoch 17/20  Iteration 2869/3560 Training loss: 1.5542 0.9075 sec/batch
Epoch 17/20  Iteration 2870/3560 Training loss: 1.5545 0.8588 sec/batch
Epoch 17/20  Iteration 2871/3560 Training loss: 1.5542 0.8716 sec/batch
Epoch 17/20  Iteration 2872/3560 Training loss: 1.5540 0.8591 sec/batch
Epoch 17/20  Iteration 2873/3560 Training loss: 1.5540 0.8597 sec/batch
Epoch 17/20  Iteration 2874/3560 Training loss: 1.5524 0.8706 sec/batch
Epoch 17/20  Iteration 2875/3560 Training loss: 1.5511 0.8699 sec/batch
Epoch 17/20  Iteration 2876/3560 Training loss: 1.5519 0.9095 sec/batch
Epoch 17/20  Iteration 2877/3560 Training loss: 1.5521 0.8848 sec/batch
Epoch 17/20  Iteration 2878/3560 Training loss: 1.5524 0.8605 sec/batch
Epoch 17/20  Iteration 2879/3560 Training loss: 1.5523 0.8580 sec/batch
Epoch 17/20  Iteration 2880/3560 Training loss: 1.5513 0.8662 sec/batch
Epoch 17/20  Iteration 2881/3560 Training loss: 1.5515 0.8632 sec/batch
Epoch 17/20  Iteration 2882/3560 Training loss: 1.5520 0.8754 sec/batch
Epoch 17/20  Iteration 2883/3560 Training loss: 1.5517 0.8651 sec/batch
Epoch 17/20  Iteration 2884/3560 Training loss: 1.5515 0.8668 sec/batch
Epoch 17/20  Iteration 2885/3560 Training loss: 1.5511 0.8554 sec/batch
Epoch 17/20  Iteration 2886/3560 Training loss: 1.5499 0.8562 sec/batch
Epoch 17/20  Iteration 2887/3560 Training loss: 1.5485 0.8601 sec/batch
Epoch 17/20  Iteration 2888/3560 Training loss: 1.5478 0.8563 sec/batch
Epoch 17/20  Iteration 2889/3560 Training loss: 1.5475 0.8609 sec/batch
Epoch 17/20  Iteration 2890/3560 Training loss: 1.5483 0.8657 sec/batch
Epoch 17/20  Iteration 2891/3560 Training loss: 1.5479 0.8604 sec/batch
Epoch 17/20  Iteration 2892/3560 Training loss: 1.5474 0.8687 sec/batch
Epoch 17/20  Iteration 2893/3560 Training loss: 1.5476 0.8638 sec/batch
Epoch 17/20  Iteration 2894/3560 Training loss: 1.5466 0.8691 sec/batch
Epoch 17/20  Iteration 2895/3560 Training loss: 1.5463 0.8740 sec/batch
Epoch 17/20  Iteration 2896/3560 Training loss: 1.5459 0.8623 sec/batch
Epoch 17/20  Iteration 2897/3560 Training loss: 1.5458 0.8643 sec/batch
Epoch 17/20  Iteration 2898/3560 Training loss: 1.5465 0.8590 sec/batch
Epoch 17/20  Iteration 2899/3560 Training loss: 1.5461 0.8698 sec/batch
Epoch 17/20  Iteration 2900/3560 Training loss: 1.5470 0.8609 sec/batch
Epoch 17/20  Iteration 2901/3560 Training loss: 1.5470 0.8573 sec/batch
Epoch 17/20  Iteration 2902/3560 Training loss: 1.5472 0.8649 sec/batch
Epoch 17/20  Iteration 2903/3560 Training loss: 1.5471 0.9069 sec/batch
Epoch 17/20  Iteration 2904/3560 Training loss: 1.5474 0.8608 sec/batch
Epoch 17/20  Iteration 2905/3560 Training loss: 1.5478 0.8610 sec/batch
Epoch 17/20  Iteration 2906/3560 Training loss: 1.5475 0.8590 sec/batch
Epoch 17/20  Iteration 2907/3560 Training loss: 1.5469 0.8644 sec/batch
Epoch 17/20  Iteration 2908/3560 Training loss: 1.5476 0.8609 sec/batch
Epoch 17/20  Iteration 2909/3560 Training loss: 1.5476 0.8717 sec/batch
Epoch 17/20  Iteration 2910/3560 Training loss: 1.5486 0.8721 sec/batch
Epoch 17/20  Iteration 2911/3560 Training loss: 1.5488 0.8946 sec/batch
Epoch 17/20  Iteration 2912/3560 Training loss: 1.5492 0.8687 sec/batch
Epoch 17/20  Iteration 2913/3560 Training loss: 1.5490 0.8585 sec/batch
Epoch 17/20  Iteration 2914/3560 Training loss: 1.5493 0.8614 sec/batch
Epoch 17/20  Iteration 2915/3560 Training loss: 1.5494 0.8695 sec/batch
Epoch 17/20  Iteration 2916/3560 Training loss: 1.5491 0.8623 sec/batch
Epoch 17/20  Iteration 2917/3560 Training loss: 1.5491 0.8636 sec/batch
Epoch 17/20  Iteration 2918/3560 Training loss: 1.5490 0.8683 sec/batch
Epoch 17/20  Iteration 2919/3560 Training loss: 1.5497 0.8597 sec/batch
Epoch 17/20  Iteration 2920/3560 Training loss: 1.5499 0.8646 sec/batch
Epoch 17/20  Iteration 2921/3560 Training loss: 1.5504 0.8734 sec/batch
Epoch 17/20  Iteration 2922/3560 Training loss: 1.5502 0.8705 sec/batch
Epoch 17/20  Iteration 2923/3560 Training loss: 1.5501 0.8624 sec/batch
Epoch 17/20  Iteration 2924/3560 Training loss: 1.5503 0.9119 sec/batch
Epoch 17/20  Iteration 2925/3560 Training loss: 1.5502 0.9126 sec/batch
Epoch 17/20  Iteration 2926/3560 Training loss: 1.5503 0.8667 sec/batch
Epoch 17/20  Iteration 2927/3560 Training loss: 1.5497 0.8604 sec/batch
Epoch 17/20  Iteration 2928/3560 Training loss: 1.5496 0.8623 sec/batch
Epoch 17/20  Iteration 2929/3560 Training loss: 1.5492 0.8610 sec/batch
Epoch 17/20  Iteration 2930/3560 Training loss: 1.5492 0.8579 sec/batch
Epoch 17/20  Iteration 2931/3560 Training loss: 1.5488 0.9139 sec/batch
Epoch 17/20  Iteration 2932/3560 Training loss: 1.5487 0.8592 sec/batch
Epoch 17/20  Iteration 2933/3560 Training loss: 1.5484 0.8689 sec/batch
Epoch 17/20  Iteration 2934/3560 Training loss: 1.5482 0.8586 sec/batch
Epoch 17/20  Iteration 2935/3560 Training loss: 1.5479 0.9083 sec/batch
Epoch 17/20  Iteration 2936/3560 Training loss: 1.5476 0.8674 sec/batch
Epoch 17/20  Iteration 2937/3560 Training loss: 1.5472 0.8738 sec/batch
Epoch 17/20  Iteration 2938/3560 Training loss: 1.5474 0.8641 sec/batch
Epoch 17/20  Iteration 2939/3560 Training loss: 1.5470 0.8636 sec/batch
Epoch 17/20  Iteration 2940/3560 Training loss: 1.5469 0.8620 sec/batch
Epoch 17/20  Iteration 2941/3560 Training loss: 1.5465 0.8538 sec/batch
Epoch 17/20  Iteration 2942/3560 Training loss: 1.5461 0.8596 sec/batch
Epoch 17/20  Iteration 2943/3560 Training loss: 1.5458 0.8632 sec/batch
Epoch 17/20  Iteration 2944/3560 Training loss: 1.5458 0.8672 sec/batch
Epoch 17/20  Iteration 2945/3560 Training loss: 1.5458 0.9165 sec/batch
Epoch 17/20  Iteration 2946/3560 Training loss: 1.5453 0.8599 sec/batch
Epoch 17/20  Iteration 2947/3560 Training loss: 1.5450 0.9088 sec/batch
Epoch 17/20  Iteration 2948/3560 Training loss: 1.5446 0.8596 sec/batch
Epoch 17/20  Iteration 2949/3560 Training loss: 1.5446 0.8663 sec/batch
Epoch 17/20  Iteration 2950/3560 Training loss: 1.5445 0.8638 sec/batch
Epoch 17/20  Iteration 2951/3560 Training loss: 1.5443 1.0456 sec/batch
Epoch 17/20  Iteration 2952/3560 Training loss: 1.5441 0.9536 sec/batch
Epoch 17/20  Iteration 2953/3560 Training loss: 1.5440 0.8622 sec/batch
Epoch 17/20  Iteration 2954/3560 Training loss: 1.5439 0.8611 sec/batch
Epoch 17/20  Iteration 2955/3560 Training loss: 1.5438 0.8684 sec/batch
Epoch 17/20  Iteration 2956/3560 Training loss: 1.5438 0.8789 sec/batch
Epoch 17/20  Iteration 2957/3560 Training loss: 1.5438 0.9426 sec/batch
Epoch 17/20  Iteration 2958/3560 Training loss: 1.5439 0.9496 sec/batch
Epoch 17/20  Iteration 2959/3560 Training loss: 1.5437 0.9175 sec/batch
Epoch 17/20  Iteration 2960/3560 Training loss: 1.5436 0.8597 sec/batch
Epoch 17/20  Iteration 2961/3560 Training loss: 1.5434 0.9184 sec/batch
Epoch 17/20  Iteration 2962/3560 Training loss: 1.5433 0.9644 sec/batch
Epoch 17/20  Iteration 2963/3560 Training loss: 1.5430 0.9389 sec/batch
Epoch 17/20  Iteration 2964/3560 Training loss: 1.5426 0.9001 sec/batch
Epoch 17/20  Iteration 2965/3560 Training loss: 1.5426 0.9759 sec/batch
Epoch 17/20  Iteration 2966/3560 Training loss: 1.5426 0.9142 sec/batch
Epoch 17/20  Iteration 2967/3560 Training loss: 1.5424 0.8644 sec/batch
Epoch 17/20  Iteration 2968/3560 Training loss: 1.5424 0.9741 sec/batch
Epoch 17/20  Iteration 2969/3560 Training loss: 1.5422 0.8662 sec/batch
Epoch 17/20  Iteration 2970/3560 Training loss: 1.5418 0.8596 sec/batch
Epoch 17/20  Iteration 2971/3560 Training loss: 1.5414 0.8633 sec/batch
Epoch 17/20  Iteration 2972/3560 Training loss: 1.5415 0.9316 sec/batch
Epoch 17/20  Iteration 2973/3560 Training loss: 1.5414 0.9089 sec/batch
Epoch 17/20  Iteration 2974/3560 Training loss: 1.5410 0.8640 sec/batch
Epoch 17/20  Iteration 2975/3560 Training loss: 1.5411 0.8650 sec/batch
Epoch 17/20  Iteration 2976/3560 Training loss: 1.5412 0.8561 sec/batch
Epoch 17/20  Iteration 2977/3560 Training loss: 1.5410 0.8611 sec/batch
Epoch 17/20  Iteration 2978/3560 Training loss: 1.5408 0.8675 sec/batch
Epoch 17/20  Iteration 2979/3560 Training loss: 1.5404 0.9311 sec/batch
Epoch 17/20  Iteration 2980/3560 Training loss: 1.5402 0.8569 sec/batch
Epoch 17/20  Iteration 2981/3560 Training loss: 1.5404 0.8551 sec/batch
Epoch 17/20  Iteration 2982/3560 Training loss: 1.5404 0.8611 sec/batch
Epoch 17/20  Iteration 2983/3560 Training loss: 1.5404 0.8655 sec/batch
Epoch 17/20  Iteration 2984/3560 Training loss: 1.5404 0.9206 sec/batch
Epoch 17/20  Iteration 2985/3560 Training loss: 1.5405 0.8643 sec/batch
Epoch 17/20  Iteration 2986/3560 Training loss: 1.5406 0.8623 sec/batch
Epoch 17/20  Iteration 2987/3560 Training loss: 1.5407 0.8631 sec/batch
Epoch 17/20  Iteration 2988/3560 Training loss: 1.5407 0.8871 sec/batch
Epoch 17/20  Iteration 2989/3560 Training loss: 1.5411 0.9178 sec/batch
Epoch 17/20  Iteration 2990/3560 Training loss: 1.5410 0.8637 sec/batch
Epoch 17/20  Iteration 2991/3560 Training loss: 1.5410 0.8651 sec/batch
Epoch 17/20  Iteration 2992/3560 Training loss: 1.5411 0.8696 sec/batch
Epoch 17/20  Iteration 2993/3560 Training loss: 1.5409 0.8586 sec/batch
Epoch 17/20  Iteration 2994/3560 Training loss: 1.5410 0.9798 sec/batch
Epoch 17/20  Iteration 2995/3560 Training loss: 1.5410 0.8886 sec/batch
Epoch 17/20  Iteration 2996/3560 Training loss: 1.5412 0.8842 sec/batch
Epoch 17/20  Iteration 2997/3560 Training loss: 1.5412 0.8871 sec/batch
Epoch 17/20  Iteration 2998/3560 Training loss: 1.5412 0.8678 sec/batch
Epoch 17/20  Iteration 2999/3560 Training loss: 1.5409 0.8571 sec/batch
Epoch 17/20  Iteration 3000/3560 Training loss: 1.5408 0.8567 sec/batch
Epoch 17/20  Iteration 3001/3560 Training loss: 1.5409 0.8589 sec/batch
Epoch 17/20  Iteration 3002/3560 Training loss: 1.5411 0.8613 sec/batch
Epoch 17/20  Iteration 3003/3560 Training loss: 1.5410 0.8656 sec/batch
Epoch 17/20  Iteration 3004/3560 Training loss: 1.5411 0.8669 sec/batch
Epoch 17/20  Iteration 3005/3560 Training loss: 1.5412 0.8664 sec/batch
Epoch 17/20  Iteration 3006/3560 Training loss: 1.5412 0.8614 sec/batch
Epoch 17/20  Iteration 3007/3560 Training loss: 1.5410 0.8575 sec/batch
Epoch 17/20  Iteration 3008/3560 Training loss: 1.5411 0.8589 sec/batch
Epoch 17/20  Iteration 3009/3560 Training loss: 1.5413 0.8591 sec/batch
Epoch 17/20  Iteration 3010/3560 Training loss: 1.5412 0.8572 sec/batch
Epoch 17/20  Iteration 3011/3560 Training loss: 1.5413 0.8573 sec/batch
Epoch 17/20  Iteration 3012/3560 Training loss: 1.5413 0.8661 sec/batch
Epoch 17/20  Iteration 3013/3560 Training loss: 1.5413 0.8840 sec/batch
Epoch 17/20  Iteration 3014/3560 Training loss: 1.5412 0.8604 sec/batch
Epoch 17/20  Iteration 3015/3560 Training loss: 1.5413 0.8637 sec/batch
Epoch 17/20  Iteration 3016/3560 Training loss: 1.5417 0.8636 sec/batch
Epoch 17/20  Iteration 3017/3560 Training loss: 1.5417 0.8643 sec/batch
Epoch 17/20  Iteration 3018/3560 Training loss: 1.5418 0.8674 sec/batch
Epoch 17/20  Iteration 3019/3560 Training loss: 1.5417 0.8648 sec/batch
Epoch 17/20  Iteration 3020/3560 Training loss: 1.5415 0.8614 sec/batch
Epoch 17/20  Iteration 3021/3560 Training loss: 1.5416 0.9112 sec/batch
Epoch 17/20  Iteration 3022/3560 Training loss: 1.5416 0.8607 sec/batch
Epoch 17/20  Iteration 3023/3560 Training loss: 1.5417 0.8634 sec/batch
Epoch 17/20  Iteration 3024/3560 Training loss: 1.5415 0.8591 sec/batch
Epoch 17/20  Iteration 3025/3560 Training loss: 1.5414 0.8584 sec/batch
Epoch 17/20  Iteration 3026/3560 Training loss: 1.5415 0.8552 sec/batch
Epoch 18/20  Iteration 3027/3560 Training loss: 1.6350 0.8570 sec/batch
Epoch 18/20  Iteration 3028/3560 Training loss: 1.5889 0.8652 sec/batch
Epoch 18/20  Iteration 3029/3560 Training loss: 1.5725 0.8603 sec/batch
Epoch 18/20  Iteration 3030/3560 Training loss: 1.5680 0.8635 sec/batch
Epoch 18/20  Iteration 3031/3560 Training loss: 1.5567 0.8698 sec/batch
Epoch 18/20  Iteration 3032/3560 Training loss: 1.5454 0.8595 sec/batch
Epoch 18/20  Iteration 3033/3560 Training loss: 1.5447 0.8696 sec/batch
Epoch 18/20  Iteration 3034/3560 Training loss: 1.5426 0.8631 sec/batch
Epoch 18/20  Iteration 3035/3560 Training loss: 1.5437 0.8603 sec/batch
Epoch 18/20  Iteration 3036/3560 Training loss: 1.5428 0.8606 sec/batch
Epoch 18/20  Iteration 3037/3560 Training loss: 1.5387 0.8597 sec/batch
Epoch 18/20  Iteration 3038/3560 Training loss: 1.5372 0.8540 sec/batch
Epoch 18/20  Iteration 3039/3560 Training loss: 1.5373 0.8597 sec/batch
Epoch 18/20  Iteration 3040/3560 Training loss: 1.5385 0.8709 sec/batch
Epoch 18/20  Iteration 3041/3560 Training loss: 1.5368 0.8627 sec/batch
Epoch 18/20  Iteration 3042/3560 Training loss: 1.5350 0.8606 sec/batch
Epoch 18/20  Iteration 3043/3560 Training loss: 1.5352 0.8622 sec/batch
Epoch 18/20  Iteration 3044/3560 Training loss: 1.5366 0.8628 sec/batch
Epoch 18/20  Iteration 3045/3560 Training loss: 1.5368 0.8721 sec/batch
Epoch 18/20  Iteration 3046/3560 Training loss: 1.5381 0.8603 sec/batch
Epoch 18/20  Iteration 3047/3560 Training loss: 1.5375 0.8798 sec/batch
Epoch 18/20  Iteration 3048/3560 Training loss: 1.5375 0.8645 sec/batch
Epoch 18/20  Iteration 3049/3560 Training loss: 1.5372 0.8600 sec/batch
Epoch 18/20  Iteration 3050/3560 Training loss: 1.5373 0.8576 sec/batch
Epoch 18/20  Iteration 3051/3560 Training loss: 1.5373 0.8683 sec/batch
Epoch 18/20  Iteration 3052/3560 Training loss: 1.5359 0.8664 sec/batch
Epoch 18/20  Iteration 3053/3560 Training loss: 1.5345 0.8618 sec/batch
Epoch 18/20  Iteration 3054/3560 Training loss: 1.5351 0.8675 sec/batch
Epoch 18/20  Iteration 3055/3560 Training loss: 1.5357 0.8699 sec/batch
Epoch 18/20  Iteration 3056/3560 Training loss: 1.5361 0.8697 sec/batch
Epoch 18/20  Iteration 3057/3560 Training loss: 1.5360 0.8729 sec/batch
Epoch 18/20  Iteration 3058/3560 Training loss: 1.5351 0.8668 sec/batch
Epoch 18/20  Iteration 3059/3560 Training loss: 1.5352 0.8738 sec/batch
Epoch 18/20  Iteration 3060/3560 Training loss: 1.5355 0.8632 sec/batch
Epoch 18/20  Iteration 3061/3560 Training loss: 1.5352 0.8664 sec/batch
Epoch 18/20  Iteration 3062/3560 Training loss: 1.5350 0.8630 sec/batch
Epoch 18/20  Iteration 3063/3560 Training loss: 1.5343 0.8608 sec/batch
Epoch 18/20  Iteration 3064/3560 Training loss: 1.5331 0.8578 sec/batch
Epoch 18/20  Iteration 3065/3560 Training loss: 1.5316 0.8591 sec/batch
Epoch 18/20  Iteration 3066/3560 Training loss: 1.5313 0.8593 sec/batch
Epoch 18/20  Iteration 3067/3560 Training loss: 1.5306 0.8569 sec/batch
Epoch 18/20  Iteration 3068/3560 Training loss: 1.5312 0.8613 sec/batch
Epoch 18/20  Iteration 3069/3560 Training loss: 1.5307 0.8648 sec/batch
Epoch 18/20  Iteration 3070/3560 Training loss: 1.5301 0.8644 sec/batch
Epoch 18/20  Iteration 3071/3560 Training loss: 1.5305 0.8684 sec/batch
Epoch 18/20  Iteration 3072/3560 Training loss: 1.5295 0.8613 sec/batch
Epoch 18/20  Iteration 3073/3560 Training loss: 1.5291 0.8674 sec/batch
Epoch 18/20  Iteration 3074/3560 Training loss: 1.5287 0.8713 sec/batch
Epoch 18/20  Iteration 3075/3560 Training loss: 1.5290 0.8688 sec/batch
Epoch 18/20  Iteration 3076/3560 Training loss: 1.5295 0.8560 sec/batch
Epoch 18/20  Iteration 3077/3560 Training loss: 1.5291 0.8609 sec/batch
Epoch 18/20  Iteration 3078/3560 Training loss: 1.5299 0.9091 sec/batch
Epoch 18/20  Iteration 3079/3560 Training loss: 1.5298 0.8697 sec/batch
Epoch 18/20  Iteration 3080/3560 Training loss: 1.5300 0.8610 sec/batch
Epoch 18/20  Iteration 3081/3560 Training loss: 1.5301 0.8575 sec/batch
Epoch 18/20  Iteration 3082/3560 Training loss: 1.5302 0.8743 sec/batch
Epoch 18/20  Iteration 3083/3560 Training loss: 1.5306 0.8615 sec/batch
Epoch 18/20  Iteration 3084/3560 Training loss: 1.5302 0.8578 sec/batch
Epoch 18/20  Iteration 3085/3560 Training loss: 1.5299 0.8673 sec/batch
Epoch 18/20  Iteration 3086/3560 Training loss: 1.5304 0.8663 sec/batch
Epoch 18/20  Iteration 3087/3560 Training loss: 1.5303 0.8706 sec/batch
Epoch 18/20  Iteration 3088/3560 Training loss: 1.5313 0.8614 sec/batch
Epoch 18/20  Iteration 3089/3560 Training loss: 1.5315 0.8639 sec/batch
Epoch 18/20  Iteration 3090/3560 Training loss: 1.5317 0.8637 sec/batch
Epoch 18/20  Iteration 3091/3560 Training loss: 1.5316 0.8664 sec/batch
Epoch 18/20  Iteration 3092/3560 Training loss: 1.5318 0.8601 sec/batch
Epoch 18/20  Iteration 3093/3560 Training loss: 1.5321 0.8723 sec/batch
Epoch 18/20  Iteration 3094/3560 Training loss: 1.5319 0.8608 sec/batch
Epoch 18/20  Iteration 3095/3560 Training loss: 1.5321 0.8558 sec/batch
Epoch 18/20  Iteration 3096/3560 Training loss: 1.5320 0.8647 sec/batch
Epoch 18/20  Iteration 3097/3560 Training loss: 1.5327 0.8562 sec/batch
Epoch 18/20  Iteration 3098/3560 Training loss: 1.5330 0.8601 sec/batch
Epoch 18/20  Iteration 3099/3560 Training loss: 1.5336 0.8664 sec/batch
Epoch 18/20  Iteration 3100/3560 Training loss: 1.5334 0.8679 sec/batch
Epoch 18/20  Iteration 3101/3560 Training loss: 1.5332 0.8592 sec/batch
Epoch 18/20  Iteration 3102/3560 Training loss: 1.5335 0.8632 sec/batch
Epoch 18/20  Iteration 3103/3560 Training loss: 1.5332 0.8578 sec/batch
Epoch 18/20  Iteration 3104/3560 Training loss: 1.5333 0.8681 sec/batch
Epoch 18/20  Iteration 3105/3560 Training loss: 1.5327 0.8614 sec/batch
Epoch 18/20  Iteration 3106/3560 Training loss: 1.5326 0.9133 sec/batch
Epoch 18/20  Iteration 3107/3560 Training loss: 1.5321 0.8634 sec/batch
Epoch 18/20  Iteration 3108/3560 Training loss: 1.5322 0.8636 sec/batch
Epoch 18/20  Iteration 3109/3560 Training loss: 1.5316 0.8587 sec/batch
Epoch 18/20  Iteration 3110/3560 Training loss: 1.5315 0.8528 sec/batch
Epoch 18/20  Iteration 3111/3560 Training loss: 1.5313 0.8615 sec/batch
Epoch 18/20  Iteration 3112/3560 Training loss: 1.5311 0.8591 sec/batch
Epoch 18/20  Iteration 3113/3560 Training loss: 1.5308 0.8621 sec/batch
Epoch 18/20  Iteration 3114/3560 Training loss: 1.5306 0.8563 sec/batch
Epoch 18/20  Iteration 3115/3560 Training loss: 1.5301 0.8656 sec/batch
Epoch 18/20  Iteration 3116/3560 Training loss: 1.5303 0.8765 sec/batch
Epoch 18/20  Iteration 3117/3560 Training loss: 1.5301 0.8552 sec/batch
Epoch 18/20  Iteration 3118/3560 Training loss: 1.5299 0.8535 sec/batch
Epoch 18/20  Iteration 3119/3560 Training loss: 1.5295 0.8593 sec/batch
Epoch 18/20  Iteration 3120/3560 Training loss: 1.5292 0.9209 sec/batch
Epoch 18/20  Iteration 3121/3560 Training loss: 1.5290 0.8627 sec/batch
Epoch 18/20  Iteration 3122/3560 Training loss: 1.5291 0.8664 sec/batch
Epoch 18/20  Iteration 3123/3560 Training loss: 1.5291 0.8611 sec/batch
Epoch 18/20  Iteration 3124/3560 Training loss: 1.5288 0.8704 sec/batch
Epoch 18/20  Iteration 3125/3560 Training loss: 1.5285 0.9064 sec/batch
Epoch 18/20  Iteration 3126/3560 Training loss: 1.5281 0.8657 sec/batch
Epoch 18/20  Iteration 3127/3560 Training loss: 1.5280 0.8626 sec/batch
Epoch 18/20  Iteration 3128/3560 Training loss: 1.5279 0.8714 sec/batch
Epoch 18/20  Iteration 3129/3560 Training loss: 1.5278 0.9721 sec/batch
Epoch 18/20  Iteration 3130/3560 Training loss: 1.5277 0.9813 sec/batch
Epoch 18/20  Iteration 3131/3560 Training loss: 1.5276 0.8919 sec/batch
Epoch 18/20  Iteration 3132/3560 Training loss: 1.5273 0.9840 sec/batch
Epoch 18/20  Iteration 3133/3560 Training loss: 1.5273 0.9005 sec/batch
Epoch 18/20  Iteration 3134/3560 Training loss: 1.5272 0.9585 sec/batch
Epoch 18/20  Iteration 3135/3560 Training loss: 1.5271 0.8607 sec/batch
Epoch 18/20  Iteration 3136/3560 Training loss: 1.5271 0.8608 sec/batch
Epoch 18/20  Iteration 3137/3560 Training loss: 1.5270 0.9016 sec/batch
Epoch 18/20  Iteration 3138/3560 Training loss: 1.5269 0.8840 sec/batch
Epoch 18/20  Iteration 3139/3560 Training loss: 1.5267 0.9656 sec/batch
Epoch 18/20  Iteration 3140/3560 Training loss: 1.5265 0.9031 sec/batch
Epoch 18/20  Iteration 3141/3560 Training loss: 1.5262 0.9526 sec/batch
Epoch 18/20  Iteration 3142/3560 Training loss: 1.5258 0.9974 sec/batch
Epoch 18/20  Iteration 3143/3560 Training loss: 1.5258 0.9945 sec/batch
Epoch 18/20  Iteration 3144/3560 Training loss: 1.5258 0.9688 sec/batch
Epoch 18/20  Iteration 3145/3560 Training loss: 1.5256 0.9802 sec/batch
Epoch 18/20  Iteration 3146/3560 Training loss: 1.5255 0.8987 sec/batch
Epoch 18/20  Iteration 3147/3560 Training loss: 1.5254 0.9085 sec/batch
Epoch 18/20  Iteration 3148/3560 Training loss: 1.5250 0.9796 sec/batch
Epoch 18/20  Iteration 3149/3560 Training loss: 1.5246 0.9692 sec/batch
Epoch 18/20  Iteration 3150/3560 Training loss: 1.5247 0.9400 sec/batch
Epoch 18/20  Iteration 3151/3560 Training loss: 1.5246 0.8565 sec/batch
Epoch 18/20  Iteration 3152/3560 Training loss: 1.5242 0.9056 sec/batch
Epoch 18/20  Iteration 3153/3560 Training loss: 1.5244 0.9577 sec/batch
Epoch 18/20  Iteration 3154/3560 Training loss: 1.5244 0.8673 sec/batch
Epoch 18/20  Iteration 3155/3560 Training loss: 1.5243 0.9127 sec/batch
Epoch 18/20  Iteration 3156/3560 Training loss: 1.5240 0.8954 sec/batch
Epoch 18/20  Iteration 3157/3560 Training loss: 1.5237 0.9086 sec/batch
Epoch 18/20  Iteration 3158/3560 Training loss: 1.5234 0.9081 sec/batch
Epoch 18/20  Iteration 3159/3560 Training loss: 1.5236 0.8665 sec/batch
Epoch 18/20  Iteration 3160/3560 Training loss: 1.5236 0.8746 sec/batch
Epoch 18/20  Iteration 3161/3560 Training loss: 1.5236 0.8631 sec/batch
Epoch 18/20  Iteration 3162/3560 Training loss: 1.5236 0.9370 sec/batch
Epoch 18/20  Iteration 3163/3560 Training loss: 1.5238 0.8657 sec/batch
Epoch 18/20  Iteration 3164/3560 Training loss: 1.5239 0.8659 sec/batch
Epoch 18/20  Iteration 3165/3560 Training loss: 1.5239 0.8683 sec/batch
Epoch 18/20  Iteration 3166/3560 Training loss: 1.5239 0.8752 sec/batch
Epoch 18/20  Iteration 3167/3560 Training loss: 1.5243 0.9477 sec/batch
Epoch 18/20  Iteration 3168/3560 Training loss: 1.5243 0.8562 sec/batch
Epoch 18/20  Iteration 3169/3560 Training loss: 1.5242 0.8667 sec/batch
Epoch 18/20  Iteration 3170/3560 Training loss: 1.5244 0.8564 sec/batch
Epoch 18/20  Iteration 3171/3560 Training loss: 1.5242 0.8570 sec/batch
Epoch 18/20  Iteration 3172/3560 Training loss: 1.5243 0.8824 sec/batch
Epoch 18/20  Iteration 3173/3560 Training loss: 1.5243 0.8861 sec/batch
Epoch 18/20  Iteration 3174/3560 Training loss: 1.5245 0.8580 sec/batch
Epoch 18/20  Iteration 3175/3560 Training loss: 1.5247 0.8608 sec/batch
Epoch 18/20  Iteration 3176/3560 Training loss: 1.5246 0.8578 sec/batch
Epoch 18/20  Iteration 3177/3560 Training loss: 1.5243 0.8668 sec/batch
Epoch 18/20  Iteration 3178/3560 Training loss: 1.5243 0.8817 sec/batch
Epoch 18/20  Iteration 3179/3560 Training loss: 1.5244 0.8829 sec/batch
Epoch 18/20  Iteration 3180/3560 Training loss: 1.5244 0.9003 sec/batch
Epoch 18/20  Iteration 3181/3560 Training loss: 1.5244 0.8529 sec/batch
Epoch 18/20  Iteration 3182/3560 Training loss: 1.5244 0.8757 sec/batch
Epoch 18/20  Iteration 3183/3560 Training loss: 1.5245 0.8597 sec/batch
Epoch 18/20  Iteration 3184/3560 Training loss: 1.5246 0.8614 sec/batch
Epoch 18/20  Iteration 3185/3560 Training loss: 1.5243 0.8691 sec/batch
Epoch 18/20  Iteration 3186/3560 Training loss: 1.5244 0.8600 sec/batch
Epoch 18/20  Iteration 3187/3560 Training loss: 1.5247 0.8663 sec/batch
Epoch 18/20  Iteration 3188/3560 Training loss: 1.5247 0.8621 sec/batch
Epoch 18/20  Iteration 3189/3560 Training loss: 1.5247 0.8639 sec/batch
Epoch 18/20  Iteration 3190/3560 Training loss: 1.5248 0.8601 sec/batch
Epoch 18/20  Iteration 3191/3560 Training loss: 1.5248 0.8654 sec/batch
Epoch 18/20  Iteration 3192/3560 Training loss: 1.5247 0.8819 sec/batch
Epoch 18/20  Iteration 3193/3560 Training loss: 1.5248 0.8627 sec/batch
Epoch 18/20  Iteration 3194/3560 Training loss: 1.5252 0.8549 sec/batch
Epoch 18/20  Iteration 3195/3560 Training loss: 1.5252 0.9032 sec/batch
Epoch 18/20  Iteration 3196/3560 Training loss: 1.5253 0.8669 sec/batch
Epoch 18/20  Iteration 3197/3560 Training loss: 1.5252 0.8622 sec/batch
Epoch 18/20  Iteration 3198/3560 Training loss: 1.5251 0.8839 sec/batch
Epoch 18/20  Iteration 3199/3560 Training loss: 1.5252 0.9209 sec/batch
Epoch 18/20  Iteration 3200/3560 Training loss: 1.5252 0.8935 sec/batch
Epoch 18/20  Iteration 3201/3560 Training loss: 1.5252 0.9023 sec/batch
Epoch 18/20  Iteration 3202/3560 Training loss: 1.5251 0.8580 sec/batch
Epoch 18/20  Iteration 3203/3560 Training loss: 1.5250 0.8615 sec/batch
Epoch 18/20  Iteration 3204/3560 Training loss: 1.5251 0.8610 sec/batch
Epoch 19/20  Iteration 3205/3560 Training loss: 1.6092 0.8630 sec/batch
Epoch 19/20  Iteration 3206/3560 Training loss: 1.5660 0.8630 sec/batch
Epoch 19/20  Iteration 3207/3560 Training loss: 1.5517 0.8644 sec/batch
Epoch 19/20  Iteration 3208/3560 Training loss: 1.5489 0.8630 sec/batch
Epoch 19/20  Iteration 3209/3560 Training loss: 1.5400 0.8681 sec/batch
Epoch 19/20  Iteration 3210/3560 Training loss: 1.5289 0.8681 sec/batch
Epoch 19/20  Iteration 3211/3560 Training loss: 1.5292 0.8615 sec/batch
Epoch 19/20  Iteration 3212/3560 Training loss: 1.5273 0.8575 sec/batch
Epoch 19/20  Iteration 3213/3560 Training loss: 1.5280 0.9140 sec/batch
Epoch 19/20  Iteration 3214/3560 Training loss: 1.5259 0.8578 sec/batch
Epoch 19/20  Iteration 3215/3560 Training loss: 1.5226 0.9124 sec/batch
Epoch 19/20  Iteration 3216/3560 Training loss: 1.5224 0.8624 sec/batch
Epoch 19/20  Iteration 3217/3560 Training loss: 1.5222 0.8824 sec/batch
Epoch 19/20  Iteration 3218/3560 Training loss: 1.5233 0.8763 sec/batch
Epoch 19/20  Iteration 3219/3560 Training loss: 1.5225 0.8663 sec/batch
Epoch 19/20  Iteration 3220/3560 Training loss: 1.5206 0.9111 sec/batch
Epoch 19/20  Iteration 3221/3560 Training loss: 1.5215 0.9015 sec/batch
Epoch 19/20  Iteration 3222/3560 Training loss: 1.5224 0.8815 sec/batch
Epoch 19/20  Iteration 3223/3560 Training loss: 1.5228 0.9051 sec/batch
Epoch 19/20  Iteration 3224/3560 Training loss: 1.5240 0.8627 sec/batch
Epoch 19/20  Iteration 3225/3560 Training loss: 1.5231 0.8552 sec/batch
Epoch 19/20  Iteration 3226/3560 Training loss: 1.5235 0.8576 sec/batch
Epoch 19/20  Iteration 3227/3560 Training loss: 1.5230 0.9026 sec/batch
Epoch 19/20  Iteration 3228/3560 Training loss: 1.5228 0.9204 sec/batch
Epoch 19/20  Iteration 3229/3560 Training loss: 1.5226 0.8681 sec/batch
Epoch 19/20  Iteration 3230/3560 Training loss: 1.5208 0.9026 sec/batch
Epoch 19/20  Iteration 3231/3560 Training loss: 1.5197 0.9011 sec/batch
Epoch 19/20  Iteration 3232/3560 Training loss: 1.5205 0.8618 sec/batch
Epoch 19/20  Iteration 3233/3560 Training loss: 1.5210 1.0136 sec/batch
Epoch 19/20  Iteration 3234/3560 Training loss: 1.5216 0.9634 sec/batch
Epoch 19/20  Iteration 3235/3560 Training loss: 1.5213 0.8736 sec/batch
Epoch 19/20  Iteration 3236/3560 Training loss: 1.5204 0.9503 sec/batch
Epoch 19/20  Iteration 3237/3560 Training loss: 1.5207 0.8940 sec/batch
Epoch 19/20  Iteration 3238/3560 Training loss: 1.5210 0.8994 sec/batch
Epoch 19/20  Iteration 3239/3560 Training loss: 1.5210 0.8839 sec/batch
Epoch 19/20  Iteration 3240/3560 Training loss: 1.5207 0.9013 sec/batch
Epoch 19/20  Iteration 3241/3560 Training loss: 1.5202 0.8677 sec/batch
Epoch 19/20  Iteration 3242/3560 Training loss: 1.5191 1.0254 sec/batch
Epoch 19/20  Iteration 3243/3560 Training loss: 1.5176 1.0396 sec/batch
Epoch 19/20  Iteration 3244/3560 Training loss: 1.5169 0.9558 sec/batch
Epoch 19/20  Iteration 3245/3560 Training loss: 1.5164 1.0936 sec/batch
Epoch 19/20  Iteration 3246/3560 Training loss: 1.5173 0.9570 sec/batch
Epoch 19/20  Iteration 3247/3560 Training loss: 1.5171 0.9281 sec/batch
Epoch 19/20  Iteration 3248/3560 Training loss: 1.5164 0.8914 sec/batch
Epoch 19/20  Iteration 3249/3560 Training loss: 1.5169 0.9419 sec/batch
Epoch 19/20  Iteration 3250/3560 Training loss: 1.5159 1.0238 sec/batch
Epoch 19/20  Iteration 3251/3560 Training loss: 1.5155 0.9679 sec/batch
Epoch 19/20  Iteration 3252/3560 Training loss: 1.5151 0.9290 sec/batch
Epoch 19/20  Iteration 3253/3560 Training loss: 1.5154 0.9191 sec/batch
Epoch 19/20  Iteration 3254/3560 Training loss: 1.5161 0.9339 sec/batch
Epoch 19/20  Iteration 3255/3560 Training loss: 1.5157 0.9730 sec/batch
Epoch 19/20  Iteration 3256/3560 Training loss: 1.5164 0.9702 sec/batch
Epoch 19/20  Iteration 3257/3560 Training loss: 1.5164 0.9657 sec/batch
Epoch 19/20  Iteration 3258/3560 Training loss: 1.5165 1.0058 sec/batch
Epoch 19/20  Iteration 3259/3560 Training loss: 1.5166 0.9762 sec/batch
Epoch 19/20  Iteration 3260/3560 Training loss: 1.5166 0.9915 sec/batch
Epoch 19/20  Iteration 3261/3560 Training loss: 1.5171 0.9940 sec/batch
Epoch 19/20  Iteration 3262/3560 Training loss: 1.5169 1.0599 sec/batch
Epoch 19/20  Iteration 3263/3560 Training loss: 1.5164 1.0783 sec/batch
Epoch 19/20  Iteration 3264/3560 Training loss: 1.5170 1.2345 sec/batch
Epoch 19/20  Iteration 3265/3560 Training loss: 1.5171 1.0993 sec/batch
Epoch 19/20  Iteration 3266/3560 Training loss: 1.5180 1.1073 sec/batch
Epoch 19/20  Iteration 3267/3560 Training loss: 1.5181 0.9914 sec/batch
Epoch 19/20  Iteration 3268/3560 Training loss: 1.5183 1.0180 sec/batch
Epoch 19/20  Iteration 3269/3560 Training loss: 1.5182 1.0143 sec/batch
Epoch 19/20  Iteration 3270/3560 Training loss: 1.5185 1.0347 sec/batch
Epoch 19/20  Iteration 3271/3560 Training loss: 1.5188 1.0278 sec/batch
Epoch 19/20  Iteration 3272/3560 Training loss: 1.5186 1.0076 sec/batch
Epoch 19/20  Iteration 3273/3560 Training loss: 1.5187 1.0337 sec/batch
Epoch 19/20  Iteration 3274/3560 Training loss: 1.5185 0.9348 sec/batch
Epoch 19/20  Iteration 3275/3560 Training loss: 1.5192 0.8989 sec/batch
Epoch 19/20  Iteration 3276/3560 Training loss: 1.5195 0.9370 sec/batch
Epoch 19/20  Iteration 3277/3560 Training loss: 1.5201 0.9401 sec/batch
Epoch 19/20  Iteration 3278/3560 Training loss: 1.5200 0.9684 sec/batch
Epoch 19/20  Iteration 3279/3560 Training loss: 1.5199 0.9369 sec/batch
Epoch 19/20  Iteration 3280/3560 Training loss: 1.5200 0.9145 sec/batch
Epoch 19/20  Iteration 3281/3560 Training loss: 1.5198 0.8625 sec/batch
Epoch 19/20  Iteration 3282/3560 Training loss: 1.5198 0.8650 sec/batch
Epoch 19/20  Iteration 3283/3560 Training loss: 1.5193 0.8642 sec/batch
Epoch 19/20  Iteration 3284/3560 Training loss: 1.5193 0.9533 sec/batch
Epoch 19/20  Iteration 3285/3560 Training loss: 1.5188 1.1237 sec/batch
Epoch 19/20  Iteration 3286/3560 Training loss: 1.5189 0.9383 sec/batch
Epoch 19/20  Iteration 3287/3560 Training loss: 1.5184 0.9076 sec/batch
Epoch 19/20  Iteration 3288/3560 Training loss: 1.5182 0.8612 sec/batch
Epoch 19/20  Iteration 3289/3560 Training loss: 1.5179 0.8660 sec/batch
Epoch 19/20  Iteration 3290/3560 Training loss: 1.5178 0.8589 sec/batch
Epoch 19/20  Iteration 3291/3560 Training loss: 1.5175 0.8607 sec/batch
Epoch 19/20  Iteration 3292/3560 Training loss: 1.5172 0.8574 sec/batch
Epoch 19/20  Iteration 3293/3560 Training loss: 1.5167 0.8660 sec/batch
Epoch 19/20  Iteration 3294/3560 Training loss: 1.5169 0.8581 sec/batch
Epoch 19/20  Iteration 3295/3560 Training loss: 1.5166 0.8689 sec/batch
Epoch 19/20  Iteration 3296/3560 Training loss: 1.5164 0.8621 sec/batch
Epoch 19/20  Iteration 3297/3560 Training loss: 1.5161 0.8577 sec/batch
Epoch 19/20  Iteration 3298/3560 Training loss: 1.5159 0.8588 sec/batch
Epoch 19/20  Iteration 3299/3560 Training loss: 1.5155 0.8609 sec/batch
Epoch 19/20  Iteration 3300/3560 Training loss: 1.5155 0.9030 sec/batch
Epoch 19/20  Iteration 3301/3560 Training loss: 1.5156 0.8555 sec/batch
Epoch 19/20  Iteration 3302/3560 Training loss: 1.5152 0.8633 sec/batch
Epoch 19/20  Iteration 3303/3560 Training loss: 1.5148 0.8724 sec/batch
Epoch 19/20  Iteration 3304/3560 Training loss: 1.5144 0.8664 sec/batch
Epoch 19/20  Iteration 3305/3560 Training loss: 1.5144 0.8678 sec/batch
Epoch 19/20  Iteration 3306/3560 Training loss: 1.5143 0.8605 sec/batch
Epoch 19/20  Iteration 3307/3560 Training loss: 1.5142 0.9047 sec/batch
Epoch 19/20  Iteration 3308/3560 Training loss: 1.5140 0.9079 sec/batch
Epoch 19/20  Iteration 3309/3560 Training loss: 1.5139 0.8615 sec/batch
Epoch 19/20  Iteration 3310/3560 Training loss: 1.5137 0.9303 sec/batch
Epoch 19/20  Iteration 3311/3560 Training loss: 1.5136 0.8810 sec/batch
Epoch 19/20  Iteration 3312/3560 Training loss: 1.5136 0.9030 sec/batch
Epoch 19/20  Iteration 3313/3560 Training loss: 1.5135 0.9498 sec/batch
Epoch 19/20  Iteration 3314/3560 Training loss: 1.5135 0.8565 sec/batch
Epoch 19/20  Iteration 3315/3560 Training loss: 1.5133 0.8735 sec/batch
Epoch 19/20  Iteration 3316/3560 Training loss: 1.5132 0.9027 sec/batch
Epoch 19/20  Iteration 3317/3560 Training loss: 1.5130 0.9001 sec/batch
Epoch 19/20  Iteration 3318/3560 Training loss: 1.5128 0.8641 sec/batch
Epoch 19/20  Iteration 3319/3560 Training loss: 1.5126 0.8692 sec/batch
Epoch 19/20  Iteration 3320/3560 Training loss: 1.5123 0.8955 sec/batch
Epoch 19/20  Iteration 3321/3560 Training loss: 1.5123 0.8963 sec/batch
Epoch 19/20  Iteration 3322/3560 Training loss: 1.5123 0.8986 sec/batch
Epoch 19/20  Iteration 3323/3560 Training loss: 1.5121 0.9493 sec/batch
Epoch 19/20  Iteration 3324/3560 Training loss: 1.5120 0.8593 sec/batch
Epoch 19/20  Iteration 3325/3560 Training loss: 1.5119 0.8650 sec/batch
Epoch 19/20  Iteration 3326/3560 Training loss: 1.5115 0.8657 sec/batch
Epoch 19/20  Iteration 3327/3560 Training loss: 1.5111 0.8649 sec/batch
Epoch 19/20  Iteration 3328/3560 Training loss: 1.5112 0.8638 sec/batch
Epoch 19/20  Iteration 3329/3560 Training loss: 1.5111 0.8692 sec/batch
Epoch 19/20  Iteration 3330/3560 Training loss: 1.5107 0.8610 sec/batch
Epoch 19/20  Iteration 3331/3560 Training loss: 1.5107 0.8641 sec/batch
Epoch 19/20  Iteration 3332/3560 Training loss: 1.5108 0.8576 sec/batch
Epoch 19/20  Iteration 3333/3560 Training loss: 1.5106 0.8595 sec/batch
Epoch 19/20  Iteration 3334/3560 Training loss: 1.5104 0.8623 sec/batch
Epoch 19/20  Iteration 3335/3560 Training loss: 1.5100 0.8634 sec/batch
Epoch 19/20  Iteration 3336/3560 Training loss: 1.5098 0.8600 sec/batch
Epoch 19/20  Iteration 3337/3560 Training loss: 1.5099 0.8714 sec/batch
Epoch 19/20  Iteration 3338/3560 Training loss: 1.5099 0.8600 sec/batch
Epoch 19/20  Iteration 3339/3560 Training loss: 1.5099 0.8629 sec/batch
Epoch 19/20  Iteration 3340/3560 Training loss: 1.5099 0.8633 sec/batch
Epoch 19/20  Iteration 3341/3560 Training loss: 1.5101 0.8592 sec/batch
Epoch 19/20  Iteration 3342/3560 Training loss: 1.5102 0.8552 sec/batch
Epoch 19/20  Iteration 3343/3560 Training loss: 1.5102 0.8562 sec/batch
Epoch 19/20  Iteration 3344/3560 Training loss: 1.5102 0.8594 sec/batch
Epoch 19/20  Iteration 3345/3560 Training loss: 1.5106 0.8639 sec/batch
Epoch 19/20  Iteration 3346/3560 Training loss: 1.5106 0.8678 sec/batch
Epoch 19/20  Iteration 3347/3560 Training loss: 1.5105 0.8804 sec/batch
Epoch 19/20  Iteration 3348/3560 Training loss: 1.5107 0.8615 sec/batch
Epoch 19/20  Iteration 3349/3560 Training loss: 1.5105 0.8638 sec/batch
Epoch 19/20  Iteration 3350/3560 Training loss: 1.5106 0.8667 sec/batch
Epoch 19/20  Iteration 3351/3560 Training loss: 1.5106 0.8642 sec/batch
Epoch 19/20  Iteration 3352/3560 Training loss: 1.5108 0.8694 sec/batch
Epoch 19/20  Iteration 3353/3560 Training loss: 1.5109 0.8669 sec/batch
Epoch 19/20  Iteration 3354/3560 Training loss: 1.5109 0.9505 sec/batch
Epoch 19/20  Iteration 3355/3560 Training loss: 1.5106 0.8595 sec/batch
Epoch 19/20  Iteration 3356/3560 Training loss: 1.5106 0.8946 sec/batch
Epoch 19/20  Iteration 3357/3560 Training loss: 1.5107 0.8633 sec/batch
Epoch 19/20  Iteration 3358/3560 Training loss: 1.5107 0.9075 sec/batch
Epoch 19/20  Iteration 3359/3560 Training loss: 1.5107 0.8535 sec/batch
Epoch 19/20  Iteration 3360/3560 Training loss: 1.5107 0.8703 sec/batch
Epoch 19/20  Iteration 3361/3560 Training loss: 1.5108 0.8647 sec/batch
Epoch 19/20  Iteration 3362/3560 Training loss: 1.5109 0.8665 sec/batch
Epoch 19/20  Iteration 3363/3560 Training loss: 1.5107 0.8683 sec/batch
Epoch 19/20  Iteration 3364/3560 Training loss: 1.5108 0.8629 sec/batch
Epoch 19/20  Iteration 3365/3560 Training loss: 1.5110 0.8678 sec/batch
Epoch 19/20  Iteration 3366/3560 Training loss: 1.5110 0.8604 sec/batch
Epoch 19/20  Iteration 3367/3560 Training loss: 1.5111 0.8633 sec/batch
Epoch 19/20  Iteration 3368/3560 Training loss: 1.5111 0.8634 sec/batch
Epoch 19/20  Iteration 3369/3560 Training loss: 1.5110 0.8643 sec/batch
Epoch 19/20  Iteration 3370/3560 Training loss: 1.5110 0.8562 sec/batch
Epoch 19/20  Iteration 3371/3560 Training loss: 1.5112 0.8594 sec/batch
Epoch 19/20  Iteration 3372/3560 Training loss: 1.5116 0.8553 sec/batch
Epoch 19/20  Iteration 3373/3560 Training loss: 1.5116 0.8993 sec/batch
Epoch 19/20  Iteration 3374/3560 Training loss: 1.5116 0.8574 sec/batch
Epoch 19/20  Iteration 3375/3560 Training loss: 1.5115 0.8656 sec/batch
Epoch 19/20  Iteration 3376/3560 Training loss: 1.5113 0.8599 sec/batch
Epoch 19/20  Iteration 3377/3560 Training loss: 1.5115 0.8660 sec/batch
Epoch 19/20  Iteration 3378/3560 Training loss: 1.5115 0.8659 sec/batch
Epoch 19/20  Iteration 3379/3560 Training loss: 1.5116 0.8571 sec/batch
Epoch 19/20  Iteration 3380/3560 Training loss: 1.5115 0.8628 sec/batch
Epoch 19/20  Iteration 3381/3560 Training loss: 1.5113 0.8803 sec/batch
Epoch 19/20  Iteration 3382/3560 Training loss: 1.5115 0.8866 sec/batch
Epoch 20/20  Iteration 3383/3560 Training loss: 1.5968 0.9123 sec/batch
Epoch 20/20  Iteration 3384/3560 Training loss: 1.5546 0.9281 sec/batch
Epoch 20/20  Iteration 3385/3560 Training loss: 1.5395 0.8728 sec/batch
Epoch 20/20  Iteration 3386/3560 Training loss: 1.5380 0.8941 sec/batch
Epoch 20/20  Iteration 3387/3560 Training loss: 1.5276 0.8535 sec/batch
Epoch 20/20  Iteration 3388/3560 Training loss: 1.5170 0.9148 sec/batch
Epoch 20/20  Iteration 3389/3560 Training loss: 1.5152 0.8623 sec/batch
Epoch 20/20  Iteration 3390/3560 Training loss: 1.5130 0.8712 sec/batch
Epoch 20/20  Iteration 3391/3560 Training loss: 1.5134 0.9082 sec/batch
Epoch 20/20  Iteration 3392/3560 Training loss: 1.5122 0.8643 sec/batch
Epoch 20/20  Iteration 3393/3560 Training loss: 1.5086 0.8626 sec/batch
Epoch 20/20  Iteration 3394/3560 Training loss: 1.5077 0.8735 sec/batch
Epoch 20/20  Iteration 3395/3560 Training loss: 1.5074 0.8675 sec/batch
Epoch 20/20  Iteration 3396/3560 Training loss: 1.5090 0.9034 sec/batch
Epoch 20/20  Iteration 3397/3560 Training loss: 1.5074 0.9855 sec/batch
Epoch 20/20  Iteration 3398/3560 Training loss: 1.5057 0.9077 sec/batch
Epoch 20/20  Iteration 3399/3560 Training loss: 1.5063 0.9466 sec/batch
Epoch 20/20  Iteration 3400/3560 Training loss: 1.5071 0.8679 sec/batch
Epoch 20/20  Iteration 3401/3560 Training loss: 1.5073 0.8641 sec/batch
Epoch 20/20  Iteration 3402/3560 Training loss: 1.5091 0.8647 sec/batch
Epoch 20/20  Iteration 3403/3560 Training loss: 1.5081 0.8622 sec/batch
Epoch 20/20  Iteration 3404/3560 Training loss: 1.5084 0.8554 sec/batch
Epoch 20/20  Iteration 3405/3560 Training loss: 1.5083 0.8550 sec/batch
Epoch 20/20  Iteration 3406/3560 Training loss: 1.5079 0.8572 sec/batch
Epoch 20/20  Iteration 3407/3560 Training loss: 1.5086 0.8682 sec/batch
Epoch 20/20  Iteration 3408/3560 Training loss: 1.5068 0.8711 sec/batch
Epoch 20/20  Iteration 3409/3560 Training loss: 1.5055 0.9050 sec/batch
Epoch 20/20  Iteration 3410/3560 Training loss: 1.5066 0.8559 sec/batch
Epoch 20/20  Iteration 3411/3560 Training loss: 1.5072 0.8643 sec/batch
Epoch 20/20  Iteration 3412/3560 Training loss: 1.5077 0.8613 sec/batch
Epoch 20/20  Iteration 3413/3560 Training loss: 1.5075 0.8607 sec/batch
Epoch 20/20  Iteration 3414/3560 Training loss: 1.5066 0.8585 sec/batch
Epoch 20/20  Iteration 3415/3560 Training loss: 1.5068 0.8789 sec/batch
Epoch 20/20  Iteration 3416/3560 Training loss: 1.5070 0.8684 sec/batch
Epoch 20/20  Iteration 3417/3560 Training loss: 1.5070 0.8614 sec/batch
Epoch 20/20  Iteration 3418/3560 Training loss: 1.5069 0.8646 sec/batch
Epoch 20/20  Iteration 3419/3560 Training loss: 1.5063 0.8659 sec/batch
Epoch 20/20  Iteration 3420/3560 Training loss: 1.5051 0.8631 sec/batch
Epoch 20/20  Iteration 3421/3560 Training loss: 1.5038 0.8644 sec/batch
Epoch 20/20  Iteration 3422/3560 Training loss: 1.5032 0.8573 sec/batch
Epoch 20/20  Iteration 3423/3560 Training loss: 1.5027 0.9072 sec/batch
Epoch 20/20  Iteration 3424/3560 Training loss: 1.5035 0.8540 sec/batch
Epoch 20/20  Iteration 3425/3560 Training loss: 1.5032 0.8619 sec/batch
Epoch 20/20  Iteration 3426/3560 Training loss: 1.5024 0.8610 sec/batch
Epoch 20/20  Iteration 3427/3560 Training loss: 1.5027 0.8552 sec/batch
Epoch 20/20  Iteration 3428/3560 Training loss: 1.5019 0.8556 sec/batch
Epoch 20/20  Iteration 3429/3560 Training loss: 1.5015 0.8729 sec/batch
Epoch 20/20  Iteration 3430/3560 Training loss: 1.5010 0.8656 sec/batch
Epoch 20/20  Iteration 3431/3560 Training loss: 1.5011 0.8694 sec/batch
Epoch 20/20  Iteration 3432/3560 Training loss: 1.5016 0.8597 sec/batch
Epoch 20/20  Iteration 3433/3560 Training loss: 1.5012 0.8677 sec/batch
Epoch 20/20  Iteration 3434/3560 Training loss: 1.5019 0.8611 sec/batch
Epoch 20/20  Iteration 3435/3560 Training loss: 1.5019 0.8592 sec/batch
Epoch 20/20  Iteration 3436/3560 Training loss: 1.5022 0.8614 sec/batch
Epoch 20/20  Iteration 3437/3560 Training loss: 1.5022 0.8632 sec/batch
Epoch 20/20  Iteration 3438/3560 Training loss: 1.5023 0.8636 sec/batch
Epoch 20/20  Iteration 3439/3560 Training loss: 1.5029 0.8631 sec/batch
Epoch 20/20  Iteration 3440/3560 Training loss: 1.5028 0.8623 sec/batch
Epoch 20/20  Iteration 3441/3560 Training loss: 1.5022 0.9088 sec/batch
Epoch 20/20  Iteration 3442/3560 Training loss: 1.5028 0.8843 sec/batch
Epoch 20/20  Iteration 3443/3560 Training loss: 1.5028 0.9503 sec/batch
Epoch 20/20  Iteration 3444/3560 Training loss: 1.5037 0.8951 sec/batch
Epoch 20/20  Iteration 3445/3560 Training loss: 1.5040 0.8775 sec/batch
Epoch 20/20  Iteration 3446/3560 Training loss: 1.5042 0.8630 sec/batch
Epoch 20/20  Iteration 3447/3560 Training loss: 1.5040 0.8660 sec/batch
Epoch 20/20  Iteration 3448/3560 Training loss: 1.5043 0.8640 sec/batch
Epoch 20/20  Iteration 3449/3560 Training loss: 1.5046 0.8808 sec/batch
Epoch 20/20  Iteration 3450/3560 Training loss: 1.5044 0.8604 sec/batch
Epoch 20/20  Iteration 3451/3560 Training loss: 1.5044 0.8626 sec/batch
Epoch 20/20  Iteration 3452/3560 Training loss: 1.5044 0.8629 sec/batch
Epoch 20/20  Iteration 3453/3560 Training loss: 1.5051 0.8637 sec/batch
Epoch 20/20  Iteration 3454/3560 Training loss: 1.5053 0.8570 sec/batch
Epoch 20/20  Iteration 3455/3560 Training loss: 1.5058 0.8635 sec/batch
Epoch 20/20  Iteration 3456/3560 Training loss: 1.5056 0.9126 sec/batch
Epoch 20/20  Iteration 3457/3560 Training loss: 1.5055 0.8622 sec/batch
Epoch 20/20  Iteration 3458/3560 Training loss: 1.5058 0.9136 sec/batch
Epoch 20/20  Iteration 3459/3560 Training loss: 1.5057 0.8673 sec/batch
Epoch 20/20  Iteration 3460/3560 Training loss: 1.5056 0.8646 sec/batch
Epoch 20/20  Iteration 3461/3560 Training loss: 1.5051 0.8574 sec/batch
Epoch 20/20  Iteration 3462/3560 Training loss: 1.5051 0.8610 sec/batch
Epoch 20/20  Iteration 3463/3560 Training loss: 1.5046 0.8744 sec/batch
Epoch 20/20  Iteration 3464/3560 Training loss: 1.5047 0.9062 sec/batch
Epoch 20/20  Iteration 3465/3560 Training loss: 1.5042 0.8679 sec/batch
Epoch 20/20  Iteration 3466/3560 Training loss: 1.5042 0.8649 sec/batch
Epoch 20/20  Iteration 3467/3560 Training loss: 1.5040 0.8629 sec/batch
Epoch 20/20  Iteration 3468/3560 Training loss: 1.5039 0.9065 sec/batch
Epoch 20/20  Iteration 3469/3560 Training loss: 1.5037 0.8628 sec/batch
Epoch 20/20  Iteration 3470/3560 Training loss: 1.5034 0.8887 sec/batch
Epoch 20/20  Iteration 3471/3560 Training loss: 1.5030 0.9017 sec/batch
Epoch 20/20  Iteration 3472/3560 Training loss: 1.5032 0.8811 sec/batch
Epoch 20/20  Iteration 3473/3560 Training loss: 1.5028 0.8766 sec/batch
Epoch 20/20  Iteration 3474/3560 Training loss: 1.5027 0.8994 sec/batch
Epoch 20/20  Iteration 3475/3560 Training loss: 1.5023 0.9010 sec/batch
Epoch 20/20  Iteration 3476/3560 Training loss: 1.5020 0.8919 sec/batch
Epoch 20/20  Iteration 3477/3560 Training loss: 1.5017 0.8969 sec/batch
Epoch 20/20  Iteration 3478/3560 Training loss: 1.5018 0.8886 sec/batch
Epoch 20/20  Iteration 3479/3560 Training loss: 1.5019 0.8927 sec/batch
Epoch 20/20  Iteration 3480/3560 Training loss: 1.5015 0.8662 sec/batch
Epoch 20/20  Iteration 3481/3560 Training loss: 1.5010 0.8609 sec/batch
Epoch 20/20  Iteration 3482/3560 Training loss: 1.5007 0.8632 sec/batch
Epoch 20/20  Iteration 3483/3560 Training loss: 1.5007 0.8820 sec/batch
Epoch 20/20  Iteration 3484/3560 Training loss: 1.5005 0.8635 sec/batch
Epoch 20/20  Iteration 3485/3560 Training loss: 1.5003 0.8627 sec/batch
Epoch 20/20  Iteration 3486/3560 Training loss: 1.5002 0.8632 sec/batch
Epoch 20/20  Iteration 3487/3560 Training loss: 1.5001 0.8659 sec/batch
Epoch 20/20  Iteration 3488/3560 Training loss: 1.5000 0.8669 sec/batch
Epoch 20/20  Iteration 3489/3560 Training loss: 1.4999 0.9110 sec/batch
Epoch 20/20  Iteration 3490/3560 Training loss: 1.4999 0.9073 sec/batch
Epoch 20/20  Iteration 3491/3560 Training loss: 1.4997 0.8660 sec/batch
Epoch 20/20  Iteration 3492/3560 Training loss: 1.4998 0.9024 sec/batch
Epoch 20/20  Iteration 3493/3560 Training loss: 1.4997 0.8603 sec/batch
Epoch 20/20  Iteration 3494/3560 Training loss: 1.4997 0.8584 sec/batch
Epoch 20/20  Iteration 3495/3560 Training loss: 1.4996 0.9053 sec/batch
Epoch 20/20  Iteration 3496/3560 Training loss: 1.4993 0.8658 sec/batch
Epoch 20/20  Iteration 3497/3560 Training loss: 1.4991 0.8605 sec/batch
Epoch 20/20  Iteration 3498/3560 Training loss: 1.4988 0.8664 sec/batch
Epoch 20/20  Iteration 3499/3560 Training loss: 1.4988 0.8592 sec/batch
Epoch 20/20  Iteration 3500/3560 Training loss: 1.4988 0.8652 sec/batch
Epoch 20/20  Iteration 3501/3560 Training loss: 1.4987 0.8651 sec/batch
Epoch 20/20  Iteration 3502/3560 Training loss: 1.4987 0.9102 sec/batch
Epoch 20/20  Iteration 3503/3560 Training loss: 1.4986 0.8582 sec/batch
Epoch 20/20  Iteration 3504/3560 Training loss: 1.4982 0.8676 sec/batch
Epoch 20/20  Iteration 3505/3560 Training loss: 1.4978 0.8622 sec/batch
Epoch 20/20  Iteration 3506/3560 Training loss: 1.4979 0.8688 sec/batch
Epoch 20/20  Iteration 3507/3560 Training loss: 1.4979 0.8666 sec/batch
Epoch 20/20  Iteration 3508/3560 Training loss: 1.4975 0.8653 sec/batch
Epoch 20/20  Iteration 3509/3560 Training loss: 1.4976 0.8590 sec/batch
Epoch 20/20  Iteration 3510/3560 Training loss: 1.4977 0.8584 sec/batch
Epoch 20/20  Iteration 3511/3560 Training loss: 1.4976 0.8581 sec/batch
Epoch 20/20  Iteration 3512/3560 Training loss: 1.4973 0.8620 sec/batch
Epoch 20/20  Iteration 3513/3560 Training loss: 1.4970 0.8584 sec/batch
Epoch 20/20  Iteration 3514/3560 Training loss: 1.4968 0.8634 sec/batch
Epoch 20/20  Iteration 3515/3560 Training loss: 1.4969 0.8626 sec/batch
Epoch 20/20  Iteration 3516/3560 Training loss: 1.4969 0.8589 sec/batch
Epoch 20/20  Iteration 3517/3560 Training loss: 1.4969 0.8760 sec/batch
Epoch 20/20  Iteration 3518/3560 Training loss: 1.4969 0.9855 sec/batch
Epoch 20/20  Iteration 3519/3560 Training loss: 1.4972 0.8746 sec/batch
Epoch 20/20  Iteration 3520/3560 Training loss: 1.4973 0.8610 sec/batch
Epoch 20/20  Iteration 3521/3560 Training loss: 1.4974 0.8645 sec/batch
Epoch 20/20  Iteration 3522/3560 Training loss: 1.4974 0.8617 sec/batch
Epoch 20/20  Iteration 3523/3560 Training loss: 1.4977 0.8601 sec/batch
Epoch 20/20  Iteration 3524/3560 Training loss: 1.4978 0.8623 sec/batch
Epoch 20/20  Iteration 3525/3560 Training loss: 1.4978 0.8584 sec/batch
Epoch 20/20  Iteration 3526/3560 Training loss: 1.4979 0.8983 sec/batch
Epoch 20/20  Iteration 3527/3560 Training loss: 1.4978 0.8988 sec/batch
Epoch 20/20  Iteration 3528/3560 Training loss: 1.4979 0.8923 sec/batch
Epoch 20/20  Iteration 3529/3560 Training loss: 1.4979 0.8787 sec/batch
Epoch 20/20  Iteration 3530/3560 Training loss: 1.4981 0.8649 sec/batch
Epoch 20/20  Iteration 3531/3560 Training loss: 1.4982 0.8625 sec/batch
Epoch 20/20  Iteration 3532/3560 Training loss: 1.4982 0.8588 sec/batch
Epoch 20/20  Iteration 3533/3560 Training loss: 1.4979 1.0128 sec/batch
Epoch 20/20  Iteration 3534/3560 Training loss: 1.4979 0.8734 sec/batch
Epoch 20/20  Iteration 3535/3560 Training loss: 1.4979 0.9112 sec/batch
Epoch 20/20  Iteration 3536/3560 Training loss: 1.4980 0.8782 sec/batch
Epoch 20/20  Iteration 3537/3560 Training loss: 1.4979 0.8617 sec/batch
Epoch 20/20  Iteration 3538/3560 Training loss: 1.4979 0.8684 sec/batch
Epoch 20/20  Iteration 3539/3560 Training loss: 1.4980 0.8667 sec/batch
Epoch 20/20  Iteration 3540/3560 Training loss: 1.4981 0.8595 sec/batch
Epoch 20/20  Iteration 3541/3560 Training loss: 1.4978 0.8627 sec/batch
Epoch 20/20  Iteration 3542/3560 Training loss: 1.4979 0.8671 sec/batch
Epoch 20/20  Iteration 3543/3560 Training loss: 1.4981 0.8633 sec/batch
Epoch 20/20  Iteration 3544/3560 Training loss: 1.4981 0.8654 sec/batch
Epoch 20/20  Iteration 3545/3560 Training loss: 1.4981 0.9045 sec/batch
Epoch 20/20  Iteration 3546/3560 Training loss: 1.4981 0.8648 sec/batch
Epoch 20/20  Iteration 3547/3560 Training loss: 1.4982 0.9094 sec/batch
Epoch 20/20  Iteration 3548/3560 Training loss: 1.4981 0.8676 sec/batch
Epoch 20/20  Iteration 3549/3560 Training loss: 1.4983 0.8783 sec/batch
Epoch 20/20  Iteration 3550/3560 Training loss: 1.4987 0.9372 sec/batch
Epoch 20/20  Iteration 3551/3560 Training loss: 1.4986 0.9128 sec/batch
Epoch 20/20  Iteration 3552/3560 Training loss: 1.4986 0.9416 sec/batch
Epoch 20/20  Iteration 3553/3560 Training loss: 1.4985 0.9077 sec/batch
Epoch 20/20  Iteration 3554/3560 Training loss: 1.4983 0.8720 sec/batch
Epoch 20/20  Iteration 3555/3560 Training loss: 1.4984 0.8649 sec/batch
Epoch 20/20  Iteration 3556/3560 Training loss: 1.4984 0.8616 sec/batch
Epoch 20/20  Iteration 3557/3560 Training loss: 1.4985 0.8656 sec/batch
Epoch 20/20  Iteration 3558/3560 Training loss: 1.4983 0.8679 sec/batch
Epoch 20/20  Iteration 3559/3560 Training loss: 1.4981 0.9087 sec/batch
Epoch 20/20  Iteration 3560/3560 Training loss: 1.4983 0.8633 sec/batch
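
The run above finishes all 20 epochs with the training loss settling just below 1.50. The output that follows restarts at epoch 1 with a loss near 4.4 and roughly twice the time per batch, so it appears to be the log of a separate training run rather than a continuation of this one. If you want to compare the two runs, the printed lines are regular enough to parse with a small standard-library helper; the sketch below is illustrative only (the regex and function name are not part of the notebook's code).

import re

# Hypothetical helper (not in the original notebook): pull iteration numbers and
# loss values out of log lines shaped like the training output above.
log_line = re.compile(r'Epoch (\d+)/\d+\s+Iteration (\d+)/\d+\s+'
                      r'Training loss: ([\d.]+)\s+([\d.]+) sec/batch')

def parse_training_log(lines):
    """Return a list of (iteration, loss) pairs from captured log lines."""
    points = []
    for line in lines:
        m = log_line.search(line)
        if m:
            points.append((int(m.group(2)), float(m.group(3))))
    return points

Feeding parse_training_log the captured output of each run gives two loss curves that can be plotted or compared point by point.
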
Epoch 1/20  Iteration 1/3560 Training loss: 4.4184 1.7386 sec/batch
Epoch 1/20  Iteration 2/3560 Training loss: 4.3804 1.7311 sec/batch
Epoch 1/20  Iteration 3/3560 Training loss: 4.2168 1.7525 sec/batch
Epoch 1/20  Iteration 4/3560 Training loss: 4.1216 1.7472 sec/batch
Epoch 1/20  Iteration 5/3560 Training loss: 4.0097 1.7258 sec/batch
Epoch 1/20  Iteration 6/3560 Training loss: 3.9258 1.7654 sec/batch
Epoch 1/20  Iteration 7/3560 Training loss: 3.8640 1.8553 sec/batch
Epoch 1/20  Iteration 8/3560 Training loss: 3.8114 1.8378 sec/batch
Epoch 1/20  Iteration 9/3560 Training loss: 3.7611 1.7737 sec/batch
Epoch 1/20  Iteration 10/3560 Training loss: 3.7178 1.7543 sec/batch
Epoch 1/20  Iteration 11/3560 Training loss: 3.6780 1.7240 sec/batch
Epoch 1/20  Iteration 12/3560 Training loss: 3.6452 1.7718 sec/batch
Epoch 1/20  Iteration 13/3560 Training loss: 3.6162 1.8107 sec/batch
Epoch 1/20  Iteration 14/3560 Training loss: 3.5914 1.8478 sec/batch
Epoch 1/20  Iteration 15/3560 Training loss: 3.5694 1.8135 sec/batch
Epoch 1/20  Iteration 16/3560 Training loss: 3.5497 1.7406 sec/batch
Epoch 1/20  Iteration 17/3560 Training loss: 3.5312 1.7137 sec/batch
Epoch 1/20  Iteration 18/3560 Training loss: 3.5164 1.7094 sec/batch
Epoch 1/20  Iteration 19/3560 Training loss: 3.5016 1.7470 sec/batch
Epoch 1/20  Iteration 20/3560 Training loss: 3.4861 2.2519 sec/batch
Epoch 1/20  Iteration 21/3560 Training loss: 3.4728 2.0322 sec/batch
Epoch 1/20  Iteration 22/3560 Training loss: 3.4609 2.0165 sec/batch
Epoch 1/20  Iteration 23/3560 Training loss: 3.4497 1.8420 sec/batch
Epoch 1/20  Iteration 24/3560 Training loss: 3.4390 1.7379 sec/batch
Epoch 1/20  Iteration 25/3560 Training loss: 3.4291 1.7057 sec/batch
Epoch 1/20  Iteration 26/3560 Training loss: 3.4207 1.7307 sec/batch
Epoch 1/20  Iteration 27/3560 Training loss: 3.4126 1.7044 sec/batch
Epoch 1/20  Iteration 28/3560 Training loss: 3.4042 1.7504 sec/batch
Epoch 1/20  Iteration 29/3560 Training loss: 3.3964 1.7307 sec/batch
Epoch 1/20  Iteration 30/3560 Training loss: 3.3896 1.7009 sec/batch
Epoch 1/20  Iteration 31/3560 Training loss: 3.3836 2.0334 sec/batch
Epoch 1/20  Iteration 32/3560 Training loss: 3.3771 1.7313 sec/batch
Epoch 1/20  Iteration 33/3560 Training loss: 3.3707 1.7368 sec/batch
Epoch 1/20  Iteration 34/3560 Training loss: 3.3653 1.7250 sec/batch
Epoch 1/20  Iteration 35/3560 Training loss: 3.3596 1.7381 sec/batch
Epoch 1/20  Iteration 36/3560 Training loss: 3.3548 1.7325 sec/batch
Epoch 1/20  Iteration 37/3560 Training loss: 3.3491 1.7322 sec/batch
Epoch 1/20  Iteration 38/3560 Training loss: 3.3440 1.7076 sec/batch
Epoch 1/20  Iteration 39/3560 Training loss: 3.3389 1.7342 sec/batch
Epoch 1/20  Iteration 40/3560 Training loss: 3.3344 1.7474 sec/batch
Epoch 1/20  Iteration 41/3560 Training loss: 3.3298 1.7342 sec/batch
Epoch 1/20  Iteration 42/3560 Training loss: 3.3256 1.7444 sec/batch
Epoch 1/20  Iteration 43/3560 Training loss: 3.3213 1.6965 sec/batch
Epoch 1/20  Iteration 44/3560 Training loss: 3.3172 1.7356 sec/batch
Epoch 1/20  Iteration 45/3560 Training loss: 3.3132 1.7170 sec/batch
Epoch 1/20  Iteration 46/3560 Training loss: 3.3097 1.7246 sec/batch
Epoch 1/20  Iteration 47/3560 Training loss: 3.3065 1.7319 sec/batch
Epoch 1/20  Iteration 48/3560 Training loss: 3.3035 1.7313 sec/batch
Epoch 1/20  Iteration 49/3560 Training loss: 3.3005 1.7320 sec/batch
Epoch 1/20  Iteration 50/3560 Training loss: 3.2976 1.7623 sec/batch
Epoch 1/20  Iteration 51/3560 Training loss: 3.2946 1.7031 sec/batch
Epoch 1/20  Iteration 52/3560 Training loss: 3.2915 1.7201 sec/batch
Epoch 1/20  Iteration 53/3560 Training loss: 3.2888 1.8940 sec/batch
Epoch 1/20  Iteration 54/3560 Training loss: 3.2858 1.7921 sec/batch
Epoch 1/20  Iteration 55/3560 Training loss: 3.2832 1.7468 sec/batch
Epoch 1/20  Iteration 56/3560 Training loss: 3.2804 1.7142 sec/batch
Epoch 1/20  Iteration 57/3560 Training loss: 3.2778 1.7122 sec/batch
Epoch 1/20  Iteration 58/3560 Training loss: 3.2753 1.7441 sec/batch
Epoch 1/20  Iteration 59/3560 Training loss: 3.2726 1.7162 sec/batch
Epoch 1/20  Iteration 60/3560 Training loss: 3.2703 1.7334 sec/batch
Epoch 1/20  Iteration 61/3560 Training loss: 3.2679 1.7110 sec/batch
Epoch 1/20  Iteration 62/3560 Training loss: 3.2660 1.7325 sec/batch
Epoch 1/20  Iteration 63/3560 Training loss: 3.2642 1.7822 sec/batch
Epoch 1/20  Iteration 64/3560 Training loss: 3.2617 1.7380 sec/batch
Epoch 1/20  Iteration 65/3560 Training loss: 3.2594 1.7733 sec/batch
Epoch 1/20  Iteration 66/3560 Training loss: 3.2575 1.7347 sec/batch
Epoch 1/20  Iteration 67/3560 Training loss: 3.2556 1.7209 sec/batch
Epoch 1/20  Iteration 68/3560 Training loss: 3.2530 1.7428 sec/batch
Epoch 1/20  Iteration 69/3560 Training loss: 3.2508 1.7440 sec/batch
Epoch 1/20  Iteration 70/3560 Training loss: 3.2490 1.7010 sec/batch
Epoch 1/20  Iteration 71/3560 Training loss: 3.2470 1.7386 sec/batch
Epoch 1/20  Iteration 72/3560 Training loss: 3.2454 1.7348 sec/batch
Epoch 1/20  Iteration 73/3560 Training loss: 3.2434 1.7304 sec/batch
Epoch 1/20  Iteration 74/3560 Training loss: 3.2416 1.7365 sec/batch
Epoch 1/20  Iteration 75/3560 Training loss: 3.2399 1.7497 sec/batch
Epoch 1/20  Iteration 76/3560 Training loss: 3.2382 1.7328 sec/batch
Epoch 1/20  Iteration 77/3560 Training loss: 3.2365 1.7408 sec/batch
Epoch 1/20  Iteration 78/3560 Training loss: 3.2346 1.7292 sec/batch
Epoch 1/20  Iteration 79/3560 Training loss: 3.2327 1.7030 sec/batch
Epoch 1/20  Iteration 80/3560 Training loss: 3.2306 1.7515 sec/batch
Epoch 1/20  Iteration 81/3560 Training loss: 3.2286 1.7174 sec/batch
Epoch 1/20  Iteration 82/3560 Training loss: 3.2267 1.7125 sec/batch
Epoch 1/20  Iteration 83/3560 Training loss: 3.2249 1.7421 sec/batch
Epoch 1/20  Iteration 84/3560 Training loss: 3.2229 1.7268 sec/batch
Epoch 1/20  Iteration 85/3560 Training loss: 3.2208 1.7416 sec/batch
Epoch 1/20  Iteration 86/3560 Training loss: 3.2187 1.7048 sec/batch
Epoch 1/20  Iteration 87/3560 Training loss: 3.2165 1.7374 sec/batch
Epoch 1/20  Iteration 88/3560 Training loss: 3.2142 1.7520 sec/batch
Epoch 1/20  Iteration 89/3560 Training loss: 3.2122 1.7100 sec/batch
Epoch 1/20  Iteration 90/3560 Training loss: 3.2100 1.7335 sec/batch
Epoch 1/20  Iteration 91/3560 Training loss: 3.2078 1.7363 sec/batch
Epoch 1/20  Iteration 92/3560 Training loss: 3.2055 1.7303 sec/batch
Epoch 1/20  Iteration 93/3560 Training loss: 3.2030 1.7487 sec/batch
Epoch 1/20  Iteration 94/3560 Training loss: 3.2006 1.7379 sec/batch
Epoch 1/20  Iteration 95/3560 Training loss: 3.1980 1.7374 sec/batch
Epoch 1/20  Iteration 96/3560 Training loss: 3.1953 1.7428 sec/batch
Epoch 1/20  Iteration 97/3560 Training loss: 3.1927 1.7514 sec/batch
Epoch 1/20  Iteration 98/3560 Training loss: 3.1897 1.7113 sec/batch
Epoch 1/20  Iteration 99/3560 Training loss: 3.1869 1.7196 sec/batch
Epoch 1/20  Iteration 100/3560 Training loss: 3.1840 1.7361 sec/batch
Epoch 1/20  Iteration 101/3560 Training loss: 3.1810 1.7226 sec/batch
Epoch 1/20  Iteration 102/3560 Training loss: 3.1780 1.7314 sec/batch
Epoch 1/20  Iteration 103/3560 Training loss: 3.1748 1.7197 sec/batch
Epoch 1/20  Iteration 104/3560 Training loss: 3.1715 1.7075 sec/batch
Epoch 1/20  Iteration 105/3560 Training loss: 3.1680 1.7278 sec/batch
Epoch 1/20  Iteration 106/3560 Training loss: 3.1648 1.7310 sec/batch
Epoch 1/20  Iteration 107/3560 Training loss: 3.1612 1.7528 sec/batch
Epoch 1/20  Iteration 108/3560 Training loss: 3.1575 1.7365 sec/batch
Epoch 1/20  Iteration 109/3560 Training loss: 3.1540 1.7326 sec/batch
Epoch 1/20  Iteration 110/3560 Training loss: 3.1501 1.7404 sec/batch
Epoch 1/20  Iteration 111/3560 Training loss: 3.1463 1.7074 sec/batch
Epoch 1/20  Iteration 112/3560 Training loss: 3.1426 1.7054 sec/batch
Epoch 1/20  Iteration 113/3560 Training loss: 3.1387 1.7136 sec/batch
Epoch 1/20  Iteration 114/3560 Training loss: 3.1346 1.7218 sec/batch
Epoch 1/20  Iteration 115/3560 Training loss: 3.1307 1.7120 sec/batch
Epoch 1/20  Iteration 116/3560 Training loss: 3.1269 1.7338 sec/batch
Epoch 1/20  Iteration 117/3560 Training loss: 3.1238 1.7426 sec/batch
Epoch 1/20  Iteration 118/3560 Training loss: 3.1201 1.7190 sec/batch
Epoch 1/20  Iteration 119/3560 Training loss: 3.1167 1.7584 sec/batch
Epoch 1/20  Iteration 120/3560 Training loss: 3.1132 1.7080 sec/batch
Epoch 1/20  Iteration 121/3560 Training loss: 3.1103 1.7446 sec/batch
Epoch 1/20  Iteration 122/3560 Training loss: 3.1068 1.8233 sec/batch
Epoch 1/20  Iteration 123/3560 Training loss: 3.1033 1.8422 sec/batch
Epoch 1/20  Iteration 124/3560 Training loss: 3.0997 1.7728 sec/batch
Epoch 1/20  Iteration 125/3560 Training loss: 3.0960 1.7299 sec/batch
Epoch 1/20  Iteration 126/3560 Training loss: 3.0921 1.7422 sec/batch
Epoch 1/20  Iteration 127/3560 Training loss: 3.0886 1.7337 sec/batch
Epoch 1/20  Iteration 128/3560 Training loss: 3.0851 1.7080 sec/batch
Epoch 1/20  Iteration 129/3560 Training loss: 3.0813 1.7053 sec/batch
Epoch 1/20  Iteration 130/3560 Training loss: 3.0777 1.7037 sec/batch
Epoch 1/20  Iteration 131/3560 Training loss: 3.0741 1.7355 sec/batch
Epoch 1/20  Iteration 132/3560 Training loss: 3.0703 1.7504 sec/batch
Epoch 1/20  Iteration 133/3560 Training loss: 3.0667 1.7338 sec/batch
Epoch 1/20  Iteration 134/3560 Training loss: 3.0632 1.7298 sec/batch
Epoch 1/20  Iteration 135/3560 Training loss: 3.0593 1.7754 sec/batch
Epoch 1/20  Iteration 136/3560 Training loss: 3.0557 1.7387 sec/batch
Epoch 1/20  Iteration 137/3560 Training loss: 3.0520 1.7315 sec/batch
Epoch 1/20  Iteration 138/3560 Training loss: 3.0483 1.7344 sec/batch
Epoch 1/20  Iteration 139/3560 Training loss: 3.0449 1.7280 sec/batch
Epoch 1/20  Iteration 140/3560 Training loss: 3.0413 1.7375 sec/batch
Epoch 1/20  Iteration 141/3560 Training loss: 3.0378 1.7428 sec/batch
Epoch 1/20  Iteration 142/3560 Training loss: 3.0342 1.7223 sec/batch
Epoch 1/20  Iteration 143/3560 Training loss: 3.0307 1.7106 sec/batch
Epoch 1/20  Iteration 144/3560 Training loss: 3.0271 1.7314 sec/batch
Epoch 1/20  Iteration 145/3560 Training loss: 3.0236 1.7332 sec/batch
Epoch 1/20  Iteration 146/3560 Training loss: 3.0203 1.7362 sec/batch
Epoch 1/20  Iteration 147/3560 Training loss: 3.0169 1.7256 sec/batch
Epoch 1/20  Iteration 148/3560 Training loss: 3.0136 1.7085 sec/batch
Epoch 1/20  Iteration 149/3560 Training loss: 3.0101 1.7340 sec/batch
Epoch 1/20  Iteration 150/3560 Training loss: 3.0066 1.7300 sec/batch
Epoch 1/20  Iteration 151/3560 Training loss: 3.0035 1.7048 sec/batch
Epoch 1/20  Iteration 152/3560 Training loss: 3.0005 1.7389 sec/batch
Epoch 1/20  Iteration 153/3560 Training loss: 2.9972 1.7200 sec/batch
Epoch 1/20  Iteration 154/3560 Training loss: 2.9940 1.7444 sec/batch
Epoch 1/20  Iteration 155/3560 Training loss: 2.9907 1.7365 sec/batch
Epoch 1/20  Iteration 156/3560 Training loss: 2.9874 1.7092 sec/batch
Epoch 1/20  Iteration 157/3560 Training loss: 2.9841 1.7514 sec/batch
Epoch 1/20  Iteration 158/3560 Training loss: 2.9809 1.7268 sec/batch
Epoch 1/20  Iteration 159/3560 Training loss: 2.9776 1.7261 sec/batch
Epoch 1/20  Iteration 160/3560 Training loss: 2.9745 1.7394 sec/batch
Epoch 1/20  Iteration 161/3560 Training loss: 2.9714 1.7309 sec/batch
Epoch 1/20  Iteration 162/3560 Training loss: 2.9681 1.7387 sec/batch
Epoch 1/20  Iteration 163/3560 Training loss: 2.9648 1.7397 sec/batch
Epoch 1/20  Iteration 164/3560 Training loss: 2.9617 1.7287 sec/batch
Epoch 1/20  Iteration 165/3560 Training loss: 2.9587 1.7370 sec/batch
Epoch 1/20  Iteration 166/3560 Training loss: 2.9556 1.7198 sec/batch
Epoch 1/20  Iteration 167/3560 Training loss: 2.9526 1.7227 sec/batch
Epoch 1/20  Iteration 168/3560 Training loss: 2.9496 1.7396 sec/batch
Epoch 1/20  Iteration 169/3560 Training loss: 2.9467 1.7041 sec/batch
Epoch 1/20  Iteration 170/3560 Training loss: 2.9436 1.7385 sec/batch
Epoch 1/20  Iteration 171/3560 Training loss: 2.9407 1.7343 sec/batch
Epoch 1/20  Iteration 172/3560 Training loss: 2.9380 1.7376 sec/batch
Epoch 1/20  Iteration 173/3560 Training loss: 2.9353 1.7388 sec/batch
Epoch 1/20  Iteration 174/3560 Training loss: 2.9327 1.7331 sec/batch
Epoch 1/20  Iteration 175/3560 Training loss: 2.9300 1.7405 sec/batch
Epoch 1/20  Iteration 176/3560 Training loss: 2.9272 1.7072 sec/batch
Epoch 1/20  Iteration 177/3560 Training loss: 2.9244 1.7105 sec/batch
Epoch 1/20  Iteration 178/3560 Training loss: 2.9214 1.7129 sec/batch
Epoch 2/20  Iteration 179/3560 Training loss: 2.4632 1.7353 sec/batch
Epoch 2/20  Iteration 180/3560 Training loss: 2.4232 1.7239 sec/batch
Epoch 2/20  Iteration 181/3560 Training loss: 2.4113 1.7290 sec/batch
Epoch 2/20  Iteration 182/3560 Training loss: 2.4077 1.7394 sec/batch
Epoch 2/20  Iteration 183/3560 Training loss: 2.4057 1.7523 sec/batch
Epoch 2/20  Iteration 184/3560 Training loss: 2.4024 1.7945 sec/batch
Epoch 2/20  Iteration 185/3560 Training loss: 2.4026 1.7404 sec/batch
Epoch 2/20  Iteration 186/3560 Training loss: 2.4041 1.7162 sec/batch
Epoch 2/20  Iteration 187/3560 Training loss: 2.4040 1.7319 sec/batch
Epoch 2/20  Iteration 188/3560 Training loss: 2.4021 1.7144 sec/batch
Epoch 2/20  Iteration 189/3560 Training loss: 2.3992 1.7303 sec/batch
Epoch 2/20  Iteration 190/3560 Training loss: 2.3984 1.7352 sec/batch
Epoch 2/20  Iteration 191/3560 Training loss: 2.3966 1.8847 sec/batch
Epoch 2/20  Iteration 192/3560 Training loss: 2.3989 1.7989 sec/batch
Epoch 2/20  Iteration 193/3560 Training loss: 2.3981 1.7738 sec/batch
Epoch 2/20  Iteration 194/3560 Training loss: 2.3974 1.7380 sec/batch
Epoch 2/20  Iteration 195/3560 Training loss: 2.3957 1.7163 sec/batch
Epoch 2/20  Iteration 196/3560 Training loss: 2.3959 1.7140 sec/batch
Epoch 2/20  Iteration 197/3560 Training loss: 2.3949 1.7401 sec/batch
Epoch 2/20  Iteration 198/3560 Training loss: 2.3922 1.7209 sec/batch
Epoch 2/20  Iteration 199/3560 Training loss: 2.3902 1.7435 sec/batch
Epoch 2/20  Iteration 200/3560 Training loss: 2.3901 1.7538 sec/batch
Epoch 2/20  Iteration 201/3560 Training loss: 2.3884 1.7124 sec/batch
Epoch 2/20  Iteration 202/3560 Training loss: 2.3866 1.7419 sec/batch
Epoch 2/20  Iteration 203/3560 Training loss: 2.3847 1.9339 sec/batch
Epoch 2/20  Iteration 204/3560 Training loss: 2.3833 1.8777 sec/batch
Epoch 2/20  Iteration 205/3560 Training loss: 2.3816 1.7119 sec/batch
Epoch 2/20  Iteration 206/3560 Training loss: 2.3805 1.7332 sec/batch
Epoch 2/20  Iteration 207/3560 Training loss: 2.3798 1.7362 sec/batch
Epoch 2/20  Iteration 208/3560 Training loss: 2.3787 1.7424 sec/batch
Epoch 2/20  Iteration 209/3560 Training loss: 2.3779 1.7482 sec/batch
Epoch 2/20  Iteration 210/3560 Training loss: 2.3763 1.7597 sec/batch
Epoch 2/20  Iteration 211/3560 Training loss: 2.3746 1.7422 sec/batch
Epoch 2/20  Iteration 212/3560 Training loss: 2.3740 1.7223 sec/batch
Epoch 2/20  Iteration 213/3560 Training loss: 2.3725 1.7156 sec/batch
Epoch 2/20  Iteration 214/3560 Training loss: 2.3714 1.7499 sec/batch
Epoch 2/20  Iteration 215/3560 Training loss: 2.3699 1.7437 sec/batch
Epoch 2/20  Iteration 216/3560 Training loss: 2.3678 1.7382 sec/batch
Epoch 2/20  Iteration 217/3560 Training loss: 2.3660 1.7399 sec/batch
Epoch 2/20  Iteration 218/3560 Training loss: 2.3641 1.7242 sec/batch
Epoch 2/20  Iteration 219/3560 Training loss: 2.3627 1.7406 sec/batch
Epoch 2/20  Iteration 220/3560 Training loss: 2.3611 1.7156 sec/batch
Epoch 2/20  Iteration 221/3560 Training loss: 2.3593 1.7131 sec/batch
Epoch 2/20  Iteration 222/3560 Training loss: 2.3574 1.7443 sec/batch
Epoch 2/20  Iteration 223/3560 Training loss: 2.3559 1.7173 sec/batch
Epoch 2/20  Iteration 224/3560 Training loss: 2.3536 1.7135 sec/batch
Epoch 2/20  Iteration 225/3560 Training loss: 2.3526 1.7124 sec/batch
Epoch 2/20  Iteration 226/3560 Training loss: 2.3510 1.7242 sec/batch
Epoch 2/20  Iteration 227/3560 Training loss: 2.3497 1.7374 sec/batch
Epoch 2/20  Iteration 228/3560 Training loss: 2.3488 1.7168 sec/batch
Epoch 2/20  Iteration 229/3560 Training loss: 2.3472 1.7132 sec/batch
Epoch 2/20  Iteration 230/3560 Training loss: 2.3463 1.7186 sec/batch
Epoch 2/20  Iteration 231/3560 Training loss: 2.3448 1.7403 sec/batch
Epoch 2/20  Iteration 232/3560 Training loss: 2.3434 1.7331 sec/batch
Epoch 2/20  Iteration 233/3560 Training loss: 2.3420 1.7434 sec/batch
Epoch 2/20  Iteration 234/3560 Training loss: 2.3408 1.7331 sec/batch
Epoch 2/20  Iteration 235/3560 Training loss: 2.3396 1.7514 sec/batch
Epoch 2/20  Iteration 236/3560 Training loss: 2.3381 1.7417 sec/batch
Epoch 2/20  Iteration 237/3560 Training loss: 2.3366 1.7166 sec/batch
Epoch 2/20  Iteration 238/3560 Training loss: 2.3358 1.7422 sec/batch
Epoch 2/20  Iteration 239/3560 Training loss: 2.3344 1.7322 sec/batch
Epoch 2/20  Iteration 240/3560 Training loss: 2.3335 1.7327 sec/batch
Epoch 2/20  Iteration 241/3560 Training loss: 2.3326 1.7130 sec/batch
Epoch 2/20  Iteration 242/3560 Training loss: 2.3315 1.7155 sec/batch
Epoch 2/20  Iteration 243/3560 Training loss: 2.3300 1.7404 sec/batch
Epoch 2/20  Iteration 244/3560 Training loss: 2.3290 1.7123 sec/batch
Epoch 2/20  Iteration 245/3560 Training loss: 2.3279 1.7089 sec/batch
Epoch 2/20  Iteration 246/3560 Training loss: 2.3262 1.7323 sec/batch
Epoch 2/20  Iteration 247/3560 Training loss: 2.3246 1.7362 sec/batch
Epoch 2/20  Iteration 248/3560 Training loss: 2.3235 1.7391 sec/batch
Epoch 2/20  Iteration 249/3560 Training loss: 2.3225 1.7391 sec/batch
Epoch 2/20  Iteration 250/3560 Training loss: 2.3216 1.7552 sec/batch
Epoch 2/20  Iteration 251/3560 Training loss: 2.3207 1.7378 sec/batch
Epoch 2/20  Iteration 252/3560 Training loss: 2.3193 1.7504 sec/batch
Epoch 2/20  Iteration 253/3560 Training loss: 2.3181 1.7216 sec/batch
Epoch 2/20  Iteration 254/3560 Training loss: 2.3175 1.7128 sec/batch
Epoch 2/20  Iteration 255/3560 Training loss: 2.3162 1.7122 sec/batch
Epoch 2/20  Iteration 256/3560 Training loss: 2.3154 1.7301 sec/batch
Epoch 2/20  Iteration 257/3560 Training loss: 2.3140 1.7132 sec/batch
Epoch 2/20  Iteration 258/3560 Training loss: 2.3128 1.7168 sec/batch
Epoch 2/20  Iteration 259/3560 Training loss: 2.3114 1.7390 sec/batch
Epoch 2/20  Iteration 260/3560 Training loss: 2.3105 1.8660 sec/batch
Epoch 2/20  Iteration 261/3560 Training loss: 2.3090 1.7940 sec/batch
Epoch 2/20  Iteration 262/3560 Training loss: 2.3077 1.7498 sec/batch
Epoch 2/20  Iteration 263/3560 Training loss: 2.3061 1.7402 sec/batch
Epoch 2/20  Iteration 264/3560 Training loss: 2.3048 1.7027 sec/batch
Epoch 2/20  Iteration 265/3560 Training loss: 2.3036 1.7212 sec/batch
Epoch 2/20  Iteration 266/3560 Training loss: 2.3023 1.7255 sec/batch
Epoch 2/20  Iteration 267/3560 Training loss: 2.3009 1.7393 sec/batch
Epoch 2/20  Iteration 268/3560 Training loss: 2.2998 1.7067 sec/batch
Epoch 2/20  Iteration 269/3560 Training loss: 2.2986 1.7533 sec/batch
Epoch 2/20  Iteration 270/3560 Training loss: 2.2975 1.7515 sec/batch
Epoch 2/20  Iteration 271/3560 Training loss: 2.2961 1.7350 sec/batch
Epoch 2/20  Iteration 272/3560 Training loss: 2.2947 1.7346 sec/batch
Epoch 2/20  Iteration 273/3560 Training loss: 2.2933 1.7443 sec/batch
Epoch 2/20  Iteration 274/3560 Training loss: 2.2920 1.7504 sec/batch
Epoch 2/20  Iteration 275/3560 Training loss: 2.2908 1.6985 sec/batch
Epoch 2/20  Iteration 276/3560 Training loss: 2.2896 1.7349 sec/batch
Epoch 2/20  Iteration 277/3560 Training loss: 2.2883 1.7499 sec/batch
Epoch 2/20  Iteration 278/3560 Training loss: 2.2869 1.7368 sec/batch
Epoch 2/20  Iteration 279/3560 Training loss: 2.2859 1.7339 sec/batch
Epoch 2/20  Iteration 280/3560 Training loss: 2.2849 1.7376 sec/batch
Epoch 2/20  Iteration 281/3560 Training loss: 2.2836 1.7380 sec/batch
Epoch 2/20  Iteration 282/3560 Training loss: 2.2823 1.7335 sec/batch
Epoch 2/20  Iteration 283/3560 Training loss: 2.2810 1.7143 sec/batch
Epoch 2/20  Iteration 284/3560 Training loss: 2.2800 1.7429 sec/batch
Epoch 2/20  Iteration 285/3560 Training loss: 2.2788 1.7380 sec/batch
Epoch 2/20  Iteration 286/3560 Training loss: 2.2778 1.7259 sec/batch
Epoch 2/20  Iteration 287/3560 Training loss: 2.2768 1.7497 sec/batch
Epoch 2/20  Iteration 288/3560 Training loss: 2.2756 1.7116 sec/batch
Epoch 2/20  Iteration 289/3560 Training loss: 2.2746 1.7440 sec/batch
Epoch 2/20  Iteration 290/3560 Training loss: 2.2735 1.7162 sec/batch
Epoch 2/20  Iteration 291/3560 Training loss: 2.2724 1.7486 sec/batch
Epoch 2/20  Iteration 292/3560 Training loss: 2.2713 1.7350 sec/batch
Epoch 2/20  Iteration 293/3560 Training loss: 2.2702 1.7389 sec/batch
Epoch 2/20  Iteration 294/3560 Training loss: 2.2688 1.7137 sec/batch
Epoch 2/20  Iteration 295/3560 Training loss: 2.2678 1.7063 sec/batch
Epoch 2/20  Iteration 296/3560 Training loss: 2.2667 1.7165 sec/batch
Epoch 2/20  Iteration 297/3560 Training loss: 2.2657 1.7386 sec/batch
Epoch 2/20  Iteration 298/3560 Training loss: 2.2647 1.7320 sec/batch
Epoch 2/20  Iteration 299/3560 Training loss: 2.2637 1.7436 sec/batch
Epoch 2/20  Iteration 300/3560 Training loss: 2.2625 1.7109 sec/batch
Epoch 2/20  Iteration 301/3560 Training loss: 2.2615 1.7594 sec/batch
Epoch 2/20  Iteration 302/3560 Training loss: 2.2605 1.7254 sec/batch
Epoch 2/20  Iteration 303/3560 Training loss: 2.2595 1.7616 sec/batch
Epoch 2/20  Iteration 304/3560 Training loss: 2.2582 1.7624 sec/batch
Epoch 2/20  Iteration 305/3560 Training loss: 2.2572 1.7188 sec/batch
Epoch 2/20  Iteration 306/3560 Training loss: 2.2563 1.7425 sec/batch
Epoch 2/20  Iteration 307/3560 Training loss: 2.2553 1.7457 sec/batch
Epoch 2/20  Iteration 308/3560 Training loss: 2.2543 1.6935 sec/batch
Epoch 2/20  Iteration 309/3560 Training loss: 2.2532 1.7694 sec/batch
Epoch 2/20  Iteration 310/3560 Training loss: 2.2520 1.8087 sec/batch
Epoch 2/20  Iteration 311/3560 Training loss: 2.2510 1.8364 sec/batch
Epoch 2/20  Iteration 312/3560 Training loss: 2.2501 1.7844 sec/batch
Epoch 2/20  Iteration 313/3560 Training loss: 2.2489 1.8149 sec/batch
Epoch 2/20  Iteration 314/3560 Training loss: 2.2480 1.7492 sec/batch
Epoch 2/20  Iteration 315/3560 Training loss: 2.2470 1.7545 sec/batch
Epoch 2/20  Iteration 316/3560 Training loss: 2.2461 1.8175 sec/batch
Epoch 2/20  Iteration 317/3560 Training loss: 2.2454 2.0573 sec/batch
Epoch 2/20  Iteration 318/3560 Training loss: 2.2443 1.9529 sec/batch
Epoch 2/20  Iteration 319/3560 Training loss: 2.2436 1.8425 sec/batch
Epoch 2/20  Iteration 320/3560 Training loss: 2.2425 2.0459 sec/batch
Epoch 2/20  Iteration 321/3560 Training loss: 2.2415 2.0088 sec/batch
Epoch 2/20  Iteration 322/3560 Training loss: 2.2405 1.7857 sec/batch
Epoch 2/20  Iteration 323/3560 Training loss: 2.2395 1.8495 sec/batch
Epoch 2/20  Iteration 324/3560 Training loss: 2.2387 1.9223 sec/batch
Epoch 2/20  Iteration 325/3560 Training loss: 2.2378 1.8786 sec/batch
Epoch 2/20  Iteration 326/3560 Training loss: 2.2371 1.7606 sec/batch
Epoch 2/20  Iteration 327/3560 Training loss: 2.2361 2.0117 sec/batch
Epoch 2/20  Iteration 328/3560 Training loss: 2.2350 1.9396 sec/batch
Epoch 2/20  Iteration 329/3560 Training loss: 2.2341 1.8196 sec/batch
Epoch 2/20  Iteration 330/3560 Training loss: 2.2334 1.7621 sec/batch
Epoch 2/20  Iteration 331/3560 Training loss: 2.2325 1.7931 sec/batch
Epoch 2/20  Iteration 332/3560 Training loss: 2.2316 1.7817 sec/batch
Epoch 2/20  Iteration 333/3560 Training loss: 2.2306 2.0395 sec/batch
Epoch 2/20  Iteration 334/3560 Training loss: 2.2297 2.3786 sec/batch
Epoch 2/20  Iteration 335/3560 Training loss: 2.2287 2.0419 sec/batch
Epoch 2/20  Iteration 336/3560 Training loss: 2.2277 2.0561 sec/batch
Epoch 2/20  Iteration 337/3560 Training loss: 2.2266 2.3978 sec/batch
Epoch 2/20  Iteration 338/3560 Training loss: 2.2258 2.2014 sec/batch
Epoch 2/20  Iteration 339/3560 Training loss: 2.2250 2.2728 sec/batch
Epoch 2/20  Iteration 340/3560 Training loss: 2.2240 2.2230 sec/batch
Epoch 2/20  Iteration 341/3560 Training loss: 2.2231 2.1588 sec/batch
Epoch 2/20  Iteration 342/3560 Training loss: 2.2222 1.7828 sec/batch
Epoch 2/20  Iteration 343/3560 Training loss: 2.2213 1.7536 sec/batch
Epoch 2/20  Iteration 344/3560 Training loss: 2.2204 1.9098 sec/batch
Epoch 2/20  Iteration 345/3560 Training loss: 2.2196 1.7107 sec/batch
Epoch 2/20  Iteration 346/3560 Training loss: 2.2188 1.7608 sec/batch
Epoch 2/20  Iteration 347/3560 Training loss: 2.2179 1.7414 sec/batch
Epoch 2/20  Iteration 348/3560 Training loss: 2.2169 1.7555 sec/batch
Epoch 2/20  Iteration 349/3560 Training loss: 2.2160 1.9557 sec/batch
Epoch 2/20  Iteration 350/3560 Training loss: 2.2151 1.7752 sec/batch
Epoch 2/20  Iteration 351/3560 Training loss: 2.2143 1.7574 sec/batch
Epoch 2/20  Iteration 352/3560 Training loss: 2.2135 1.7601 sec/batch
Epoch 2/20  Iteration 353/3560 Training loss: 2.2128 1.8477 sec/batch
Epoch 2/20  Iteration 354/3560 Training loss: 2.2119 1.7963 sec/batch
Epoch 2/20  Iteration 355/3560 Training loss: 2.2109 2.0669 sec/batch
Epoch 2/20  Iteration 356/3560 Training loss: 2.2100 2.1088 sec/batch
Epoch 3/20  Iteration 357/3560 Training loss: 2.1149 1.9220 sec/batch
Epoch 3/20  Iteration 358/3560 Training loss: 2.0741 1.8843 sec/batch
Epoch 3/20  Iteration 359/3560 Training loss: 2.0594 1.7341 sec/batch
Epoch 3/20  Iteration 360/3560 Training loss: 2.0503 1.8938 sec/batch
Epoch 3/20  Iteration 361/3560 Training loss: 2.0492 2.0863 sec/batch
Epoch 3/20  Iteration 362/3560 Training loss: 2.0420 1.9285 sec/batch
Epoch 3/20  Iteration 363/3560 Training loss: 2.0428 1.8953 sec/batch
Epoch 3/20  Iteration 364/3560 Training loss: 2.0418 1.7379 sec/batch
Epoch 3/20  Iteration 365/3560 Training loss: 2.0436 1.9287 sec/batch
Epoch 3/20  Iteration 366/3560 Training loss: 2.0428 1.7553 sec/batch
Epoch 3/20  Iteration 367/3560 Training loss: 2.0401 1.8875 sec/batch
Epoch 3/20  Iteration 368/3560 Training loss: 2.0382 1.9272 sec/batch
Epoch 3/20  Iteration 369/3560 Training loss: 2.0377 1.9256 sec/batch
Epoch 3/20  Iteration 370/3560 Training loss: 2.0395 2.7588 sec/batch
Epoch 3/20  Iteration 371/3560 Training loss: 2.0389 2.5272 sec/batch
Epoch 3/20  Iteration 372/3560 Training loss: 2.0374 3.0239 sec/batch
Epoch 3/20  Iteration 373/3560 Training loss: 2.0367 2.2999 sec/batch
Epoch 3/20  Iteration 374/3560 Training loss: 2.0383 2.1846 sec/batch
Epoch 3/20  Iteration 375/3560 Training loss: 2.0374 2.3974 sec/batch
Epoch 3/20  Iteration 376/3560 Training loss: 2.0367 2.2691 sec/batch
Epoch 3/20  Iteration 377/3560 Training loss: 2.0352 2.1853 sec/batch
Epoch 3/20  Iteration 378/3560 Training loss: 2.0361 2.1773 sec/batch
Epoch 3/20  Iteration 379/3560 Training loss: 2.0348 2.1379 sec/batch
Epoch 3/20  Iteration 380/3560 Training loss: 2.0336 2.1207 sec/batch
Epoch 3/20  Iteration 381/3560 Training loss: 2.0322 2.0651 sec/batch
Epoch 3/20  Iteration 382/3560 Training loss: 2.0307 2.1583 sec/batch
Epoch 3/20  Iteration 383/3560 Training loss: 2.0298 2.0879 sec/batch
Epoch 3/20  Iteration 384/3560 Training loss: 2.0292 2.0854 sec/batch
Epoch 3/20  Iteration 385/3560 Training loss: 2.0294 2.0958 sec/batch
Epoch 3/20  Iteration 386/3560 Training loss: 2.0291 2.1168 sec/batch
Epoch 3/20  Iteration 387/3560 Training loss: 2.0287 2.1849 sec/batch
Epoch 3/20  Iteration 388/3560 Training loss: 2.0274 2.1757 sec/batch
Epoch 3/20  Iteration 389/3560 Training loss: 2.0268 2.1121 sec/batch
Epoch 3/20  Iteration 390/3560 Training loss: 2.0270 2.1314 sec/batch
Epoch 3/20  Iteration 391/3560 Training loss: 2.0256 2.0852 sec/batch
Epoch 3/20  Iteration 392/3560 Training loss: 2.0247 2.1030 sec/batch
Epoch 3/20  Iteration 393/3560 Training loss: 2.0235 2.1063 sec/batch
Epoch 3/20  Iteration 394/3560 Training loss: 2.0219 2.1116 sec/batch
Epoch 3/20  Iteration 395/3560 Training loss: 2.0199 2.0570 sec/batch
Epoch 3/20  Iteration 396/3560 Training loss: 2.0184 2.0702 sec/batch
Epoch 3/20  Iteration 397/3560 Training loss: 2.0172 2.0973 sec/batch
Epoch 3/20  Iteration 398/3560 Training loss: 2.0164 2.0823 sec/batch
Epoch 3/20  Iteration 399/3560 Training loss: 2.0152 2.0951 sec/batch
Epoch 3/20  Iteration 400/3560 Training loss: 2.0138 2.0429 sec/batch
Epoch 3/20  Iteration 401/3560 Training loss: 2.0134 2.0701 sec/batch
Epoch 3/20  Iteration 402/3560 Training loss: 2.0116 2.0643 sec/batch
Epoch 3/20  Iteration 403/3560 Training loss: 2.0111 2.0860 sec/batch
Epoch 3/20  Iteration 404/3560 Training loss: 2.0100 2.1170 sec/batch
Epoch 3/20  Iteration 405/3560 Training loss: 2.0092 2.3655 sec/batch
Epoch 3/20  Iteration 406/3560 Training loss: 2.0093 2.5327 sec/batch
Epoch 3/20  Iteration 407/3560 Training loss: 2.0082 2.1659 sec/batch
Epoch 3/20  Iteration 408/3560 Training loss: 2.0084 2.1519 sec/batch
Epoch 3/20  Iteration 409/3560 Training loss: 2.0076 2.5426 sec/batch
Epoch 3/20  Iteration 410/3560 Training loss: 2.0069 2.2058 sec/batch
Epoch 3/20  Iteration 411/3560 Training loss: 2.0061 2.5336 sec/batch
Epoch 3/20  Iteration 412/3560 Training loss: 2.0055 2.2707 sec/batch
Epoch 3/20  Iteration 413/3560 Training loss: 2.0052 2.2738 sec/batch
Epoch 3/20  Iteration 414/3560 Training loss: 2.0044 2.3387 sec/batch
Epoch 3/20  Iteration 415/3560 Training loss: 2.0034 2.2623 sec/batch
Epoch 3/20  Iteration 416/3560 Training loss: 2.0033 2.2690 sec/batch
Epoch 3/20  Iteration 417/3560 Training loss: 2.0027 2.2683 sec/batch
Epoch 3/20  Iteration 418/3560 Training loss: 2.0028 2.2555 sec/batch
Epoch 3/20  Iteration 419/3560 Training loss: 2.0026 2.2716 sec/batch
Epoch 3/20  Iteration 420/3560 Training loss: 2.0021 2.2900 sec/batch
Epoch 3/20  Iteration 421/3560 Training loss: 2.0013 2.2954 sec/batch
Epoch 3/20  Iteration 422/3560 Training loss: 2.0011 2.2258 sec/batch
Epoch 3/20  Iteration 423/3560 Training loss: 2.0006 2.3042 sec/batch
Epoch 3/20  Iteration 424/3560 Training loss: 1.9996 2.2293 sec/batch
Epoch 3/20  Iteration 425/3560 Training loss: 1.9989 2.3093 sec/batch
Epoch 3/20  Iteration 426/3560 Training loss: 1.9982 2.3235 sec/batch
Epoch 3/20  Iteration 427/3560 Training loss: 1.9980 2.3278 sec/batch
Epoch 3/20  Iteration 428/3560 Training loss: 1.9976 2.2910 sec/batch
Epoch 3/20  Iteration 429/3560 Training loss: 1.9974 2.3409 sec/batch
Epoch 3/20  Iteration 430/3560 Training loss: 1.9966 2.3251 sec/batch
Epoch 3/20  Iteration 431/3560 Training loss: 1.9960 2.2821 sec/batch
Epoch 3/20  Iteration 432/3560 Training loss: 1.9958 2.3235 sec/batch
Epoch 3/20  Iteration 433/3560 Training loss: 1.9951 2.3630 sec/batch
Epoch 3/20  Iteration 434/3560 Training loss: 1.9946 2.3543 sec/batch
Epoch 3/20  Iteration 435/3560 Training loss: 1.9936 2.2636 sec/batch
Epoch 3/20  Iteration 436/3560 Training loss: 1.9931 2.2926 sec/batch
Epoch 3/20  Iteration 437/3560 Training loss: 1.9920 2.3079 sec/batch
Epoch 3/20  Iteration 438/3560 Training loss: 1.9917 2.2988 sec/batch
Epoch 3/20  Iteration 439/3560 Training loss: 1.9908 2.2314 sec/batch
Epoch 3/20  Iteration 440/3560 Training loss: 1.9901 2.2673 sec/batch
Epoch 3/20  Iteration 441/3560 Training loss: 1.9892 2.4957 sec/batch
Epoch 3/20  Iteration 442/3560 Training loss: 1.9884 2.3292 sec/batch
Epoch 3/20  Iteration 443/3560 Training loss: 1.9877 2.2952 sec/batch
Epoch 3/20  Iteration 444/3560 Training loss: 1.9869 2.3250 sec/batch
Epoch 3/20  Iteration 445/3560 Training loss: 1.9859 2.3422 sec/batch
Epoch 3/20  Iteration 446/3560 Training loss: 1.9855 2.3504 sec/batch
Epoch 3/20  Iteration 447/3560 Training loss: 1.9847 2.3686 sec/batch
Epoch 3/20  Iteration 448/3560 Training loss: 1.9841 2.3208 sec/batch
Epoch 3/20  Iteration 449/3560 Training loss: 1.9831 2.3368 sec/batch
Epoch 3/20  Iteration 450/3560 Training loss: 1.9822 2.3561 sec/batch
Epoch 3/20  Iteration 451/3560 Training loss: 1.9813 2.3792 sec/batch
Epoch 3/20  Iteration 452/3560 Training loss: 1.9807 2.3167 sec/batch
Epoch 3/20  Iteration 453/3560 Training loss: 1.9802 2.2921 sec/batch
Epoch 3/20  Iteration 454/3560 Training loss: 1.9795 2.3408 sec/batch
Epoch 3/20  Iteration 455/3560 Training loss: 1.9786 2.0242 sec/batch
Epoch 3/20  Iteration 456/3560 Training loss: 1.9777 2.1606 sec/batch
Epoch 3/20  Iteration 457/3560 Training loss: 1.9771 2.4738 sec/batch
Epoch 3/20  Iteration 458/3560 Training loss: 1.9765 2.4017 sec/batch
Epoch 3/20  Iteration 459/3560 Training loss: 1.9757 2.1815 sec/batch
Epoch 3/20  Iteration 460/3560 Training loss: 1.9750 2.1249 sec/batch
Epoch 3/20  Iteration 461/3560 Training loss: 1.9743 2.6794 sec/batch
Epoch 3/20  Iteration 462/3560 Training loss: 1.9738 2.2267 sec/batch
Epoch 3/20  Iteration 463/3560 Training loss: 1.9731 2.1121 sec/batch
Epoch 3/20  Iteration 464/3560 Training loss: 1.9726 2.0821 sec/batch
Epoch 3/20  Iteration 465/3560 Training loss: 1.9722 2.0644 sec/batch
Epoch 3/20  Iteration 466/3560 Training loss: 1.9716 2.3043 sec/batch
Epoch 3/20  Iteration 467/3560 Training loss: 1.9710 2.2843 sec/batch
Epoch 3/20  Iteration 468/3560 Training loss: 1.9703 2.1788 sec/batch
Epoch 3/20  Iteration 469/3560 Training loss: 1.9697 2.0658 sec/batch
Epoch 3/20  Iteration 470/3560 Training loss: 1.9691 2.0902 sec/batch
Epoch 3/20  Iteration 471/3560 Training loss: 1.9684 2.1381 sec/batch
Epoch 3/20  Iteration 472/3560 Training loss: 1.9675 2.1426 sec/batch
Epoch 3/20  Iteration 473/3560 Training loss: 1.9669 2.0798 sec/batch
Epoch 3/20  Iteration 474/3560 Training loss: 1.9663 2.0680 sec/batch
Epoch 3/20  Iteration 475/3560 Training loss: 1.9658 2.1121 sec/batch
Epoch 3/20  Iteration 476/3560 Training loss: 1.9652 2.1019 sec/batch
Epoch 3/20  Iteration 477/3560 Training loss: 1.9647 2.0811 sec/batch
Epoch 3/20  Iteration 478/3560 Training loss: 1.9640 2.0782 sec/batch
Epoch 3/20  Iteration 479/3560 Training loss: 1.9634 2.1339 sec/batch
Epoch 3/20  Iteration 480/3560 Training loss: 1.9630 2.0719 sec/batch
Epoch 3/20  Iteration 481/3560 Training loss: 1.9624 2.0749 sec/batch
Epoch 3/20  Iteration 482/3560 Training loss: 1.9616 2.0484 sec/batch
Epoch 3/20  Iteration 483/3560 Training loss: 1.9613 2.1025 sec/batch
Epoch 3/20  Iteration 484/3560 Training loss: 1.9609 2.0743 sec/batch
Epoch 3/20  Iteration 485/3560 Training loss: 1.9603 2.0416 sec/batch
Epoch 3/20  Iteration 486/3560 Training loss: 1.9598 2.0841 sec/batch
Epoch 3/20  Iteration 487/3560 Training loss: 1.9591 2.0924 sec/batch
Epoch 3/20  Iteration 488/3560 Training loss: 1.9584 2.1950 sec/batch
Epoch 3/20  Iteration 489/3560 Training loss: 1.9580 2.6732 sec/batch
Epoch 3/20  Iteration 490/3560 Training loss: 1.9576 2.5017 sec/batch
Epoch 3/20  Iteration 491/3560 Training loss: 1.9570 2.6341 sec/batch
Epoch 3/20  Iteration 492/3560 Training loss: 1.9566 2.7562 sec/batch
Epoch 3/20  Iteration 493/3560 Training loss: 1.9562 2.6080 sec/batch
Epoch 3/20  Iteration 494/3560 Training loss: 1.9558 2.8330 sec/batch
Epoch 3/20  Iteration 495/3560 Training loss: 1.9555 2.2752 sec/batch
Epoch 3/20  Iteration 496/3560 Training loss: 1.9550 2.6307 sec/batch
Epoch 3/20  Iteration 497/3560 Training loss: 1.9547 2.3401 sec/batch
Epoch 3/20  Iteration 498/3560 Training loss: 1.9542 2.0896 sec/batch
Epoch 3/20  Iteration 499/3560 Training loss: 1.9536 2.3051 sec/batch
Epoch 3/20  Iteration 500/3560 Training loss: 1.9532 2.5983 sec/batch
Epoch 3/20  Iteration 501/3560 Training loss: 1.9527 2.0884 sec/batch
Epoch 3/20  Iteration 502/3560 Training loss: 1.9522 2.2911 sec/batch
Epoch 3/20  Iteration 503/3560 Training loss: 1.9518 2.2701 sec/batch
Epoch 3/20  Iteration 504/3560 Training loss: 1.9515 2.0693 sec/batch
Epoch 3/20  Iteration 505/3560 Training loss: 1.9510 2.0877 sec/batch
Epoch 3/20  Iteration 506/3560 Training loss: 1.9504 2.1042 sec/batch
Epoch 3/20  Iteration 507/3560 Training loss: 1.9499 2.0617 sec/batch
Epoch 3/20  Iteration 508/3560 Training loss: 1.9497 2.0852 sec/batch
Epoch 3/20  Iteration 509/3560 Training loss: 1.9493 2.0616 sec/batch
Epoch 3/20  Iteration 510/3560 Training loss: 1.9489 2.1233 sec/batch
Epoch 3/20  Iteration 511/3560 Training loss: 1.9484 2.0974 sec/batch
Epoch 3/20  Iteration 512/3560 Training loss: 1.9479 2.0734 sec/batch
Epoch 3/20  Iteration 513/3560 Training loss: 1.9475 2.0380 sec/batch
Epoch 3/20  Iteration 514/3560 Training loss: 1.9470 2.0998 sec/batch
Epoch 3/20  Iteration 515/3560 Training loss: 1.9464 2.0745 sec/batch
Epoch 3/20  Iteration 516/3560 Training loss: 1.9461 2.0685 sec/batch
Epoch 3/20  Iteration 517/3560 Training loss: 1.9458 2.0693 sec/batch
Epoch 3/20  Iteration 518/3560 Training loss: 1.9453 2.1210 sec/batch
Epoch 3/20  Iteration 519/3560 Training loss: 1.9450 2.1213 sec/batch
Epoch 3/20  Iteration 520/3560 Training loss: 1.9445 2.3835 sec/batch
Epoch 3/20  Iteration 521/3560 Training loss: 1.9441 2.1372 sec/batch
Epoch 3/20  Iteration 522/3560 Training loss: 1.9436 2.0121 sec/batch
Epoch 3/20  Iteration 523/3560 Training loss: 1.9432 1.9541 sec/batch
Epoch 3/20  Iteration 524/3560 Training loss: 1.9431 2.0021 sec/batch
Epoch 3/20  Iteration 525/3560 Training loss: 1.9426 2.3371 sec/batch
Epoch 3/20  Iteration 526/3560 Training loss: 1.9421 2.6328 sec/batch
Epoch 3/20  Iteration 527/3560 Training loss: 1.9416 2.3398 sec/batch
Epoch 3/20  Iteration 528/3560 Training loss: 1.9411 2.2219 sec/batch
Epoch 3/20  Iteration 529/3560 Training loss: 1.9407 2.1008 sec/batch
Epoch 3/20  Iteration 530/3560 Training loss: 1.9403 2.7019 sec/batch
Epoch 3/20  Iteration 531/3560 Training loss: 1.9400 2.1373 sec/batch
Epoch 3/20  Iteration 532/3560 Training loss: 1.9395 2.0659 sec/batch
Epoch 3/20  Iteration 533/3560 Training loss: 1.9390 2.0960 sec/batch
Epoch 3/20  Iteration 534/3560 Training loss: 1.9385 2.0756 sec/batch
Epoch 4/20  Iteration 535/3560 Training loss: 1.9278 2.0371 sec/batch
Epoch 4/20  Iteration 536/3560 Training loss: 1.8878 2.0314 sec/batch
Epoch 4/20  Iteration 537/3560 Training loss: 1.8714 2.1155 sec/batch
Epoch 4/20  Iteration 538/3560 Training loss: 1.8646 2.0776 sec/batch
Epoch 4/20  Iteration 539/3560 Training loss: 1.8626 2.1177 sec/batch
Epoch 4/20  Iteration 540/3560 Training loss: 1.8541 2.1115 sec/batch
Epoch 4/20  Iteration 541/3560 Training loss: 1.8554 2.0993 sec/batch
Epoch 4/20  Iteration 542/3560 Training loss: 1.8532 2.1013 sec/batch
Epoch 4/20  Iteration 543/3560 Training loss: 1.8559 2.0399 sec/batch
Epoch 4/20  Iteration 544/3560 Training loss: 1.8549 2.1020 sec/batch
Epoch 4/20  Iteration 545/3560 Training loss: 1.8507 2.0808 sec/batch
Epoch 4/20  Iteration 546/3560 Training loss: 1.8486 2.0810 sec/batch
Epoch 4/20  Iteration 547/3560 Training loss: 1.8493 2.0930 sec/batch
Epoch 4/20  Iteration 548/3560 Training loss: 1.8521 2.1565 sec/batch
Epoch 4/20  Iteration 549/3560 Training loss: 1.8510 2.2261 sec/batch
Epoch 4/20  Iteration 550/3560 Training loss: 1.8494 2.1253 sec/batch
Epoch 4/20  Iteration 551/3560 Training loss: 1.8487 2.1331 sec/batch
Epoch 4/20  Iteration 552/3560 Training loss: 1.8504 2.1234 sec/batch
Epoch 4/20  Iteration 553/3560 Training loss: 1.8498 2.0909 sec/batch
Epoch 4/20  Iteration 554/3560 Training loss: 1.8501 2.1015 sec/batch
Epoch 4/20  Iteration 555/3560 Training loss: 1.8494 2.0870 sec/batch
Epoch 4/20  Iteration 556/3560 Training loss: 1.8504 2.1690 sec/batch
Epoch 4/20  Iteration 557/3560 Training loss: 1.8498 2.0659 sec/batch
Epoch 4/20  Iteration 558/3560 Training loss: 1.8486 2.0945 sec/batch
Epoch 4/20  Iteration 559/3560 Training loss: 1.8479 2.1097 sec/batch
Epoch 4/20  Iteration 560/3560 Training loss: 1.8469 2.0990 sec/batch
Epoch 4/20  Iteration 561/3560 Training loss: 1.8456 2.0865 sec/batch
Epoch 4/20  Iteration 562/3560 Training loss: 1.8453 2.0324 sec/batch
Epoch 4/20  Iteration 563/3560 Training loss: 1.8456 2.0853 sec/batch
Epoch 4/20  Iteration 564/3560 Training loss: 1.8455 2.1134 sec/batch
Epoch 4/20  Iteration 565/3560 Training loss: 1.8448 2.0542 sec/batch
Epoch 4/20  Iteration 566/3560 Training loss: 1.8436 2.1089 sec/batch
Epoch 4/20  Iteration 567/3560 Training loss: 1.8434 2.1190 sec/batch
Epoch 4/20  Iteration 568/3560 Training loss: 1.8435 2.1017 sec/batch
Epoch 4/20  Iteration 569/3560 Training loss: 1.8428 2.0348 sec/batch
Epoch 4/20  Iteration 570/3560 Training loss: 1.8422 2.0681 sec/batch
Epoch 4/20  Iteration 571/3560 Training loss: 1.8414 2.1266 sec/batch
Epoch 4/20  Iteration 572/3560 Training loss: 1.8397 2.0391 sec/batch
Epoch 4/20  Iteration 573/3560 Training loss: 1.8382 2.0844 sec/batch
Epoch 4/20  Iteration 574/3560 Training loss: 1.8372 2.0787 sec/batch
Epoch 4/20  Iteration 575/3560 Training loss: 1.8363 2.1315 sec/batch
Epoch 4/20  Iteration 576/3560 Training loss: 1.8362 2.0699 sec/batch
Epoch 4/20  Iteration 577/3560 Training loss: 1.8353 2.0855 sec/batch
Epoch 4/20  Iteration 578/3560 Training loss: 1.8341 2.1985 sec/batch
Epoch 4/20  Iteration 579/3560 Training loss: 1.8340 2.1276 sec/batch
Epoch 4/20  Iteration 580/3560 Training loss: 1.8325 2.0884 sec/batch
Epoch 4/20  Iteration 581/3560 Training loss: 1.8322 2.0960 sec/batch
Epoch 4/20  Iteration 582/3560 Training loss: 1.8314 2.0845 sec/batch
Epoch 4/20  Iteration 583/3560 Training loss: 1.8308 2.0725 sec/batch
Epoch 4/20  Iteration 584/3560 Training loss: 1.8312 2.0739 sec/batch
Epoch 4/20  Iteration 585/3560 Training loss: 1.8304 2.0533 sec/batch
Epoch 4/20  Iteration 586/3560 Training loss: 1.8310 2.1342 sec/batch
Epoch 4/20  Iteration 587/3560 Training loss: 1.8303 2.0793 sec/batch
Epoch 4/20  Iteration 588/3560 Training loss: 1.8300 2.1059 sec/batch
Epoch 4/20  Iteration 589/3560 Training loss: 1.8295 2.0908 sec/batch
Epoch 4/20  Iteration 590/3560 Training loss: 1.8292 2.0783 sec/batch
Epoch 4/20  Iteration 591/3560 Training loss: 1.8291 2.1132 sec/batch
Epoch 4/20  Iteration 592/3560 Training loss: 1.8285 2.1135 sec/batch
Epoch 4/20  Iteration 593/3560 Training loss: 1.8278 2.1097 sec/batch
Epoch 4/20  Iteration 594/3560 Training loss: 1.8280 2.1376 sec/batch
Epoch 4/20  Iteration 595/3560 Training loss: 1.8275 2.0990 sec/batch
Epoch 4/20  Iteration 596/3560 Training loss: 1.8279 2.1029 sec/batch
Epoch 4/20  Iteration 597/3560 Training loss: 1.8280 2.0730 sec/batch
Epoch 4/20  Iteration 598/3560 Training loss: 1.8279 1.9605 sec/batch
Epoch 4/20  Iteration 599/3560 Training loss: 1.8275 2.0559 sec/batch
Epoch 4/20  Iteration 600/3560 Training loss: 1.8277 1.9930 sec/batch
Epoch 4/20  Iteration 601/3560 Training loss: 1.8275 1.8897 sec/batch
Epoch 4/20  Iteration 602/3560 Training loss: 1.8268 2.0313 sec/batch
Epoch 4/20  Iteration 603/3560 Training loss: 1.8264 2.2886 sec/batch
Epoch 4/20  Iteration 604/3560 Training loss: 1.8259 2.0692 sec/batch
Epoch 4/20  Iteration 605/3560 Training loss: 1.8260 1.9246 sec/batch
Epoch 4/20  Iteration 606/3560 Training loss: 1.8260 1.7814 sec/batch
Epoch 4/20  Iteration 607/3560 Training loss: 1.8261 1.9601 sec/batch
Epoch 4/20  Iteration 608/3560 Training loss: 1.8255 1.7771 sec/batch
Epoch 4/20  Iteration 609/3560 Training loss: 1.8252 1.7042 sec/batch
Epoch 4/20  Iteration 610/3560 Training loss: 1.8253 1.7426 sec/batch
Epoch 4/20  Iteration 611/3560 Training loss: 1.8250 1.7298 sec/batch
Epoch 4/20  Iteration 612/3560 Training loss: 1.8247 1.7378 sec/batch
Epoch 4/20  Iteration 613/3560 Training loss: 1.8240 1.7263 sec/batch
Epoch 4/20  Iteration 614/3560 Training loss: 1.8236 1.7215 sec/batch
Epoch 4/20  Iteration 615/3560 Training loss: 1.8229 1.7534 sec/batch
Epoch 4/20  Iteration 616/3560 Training loss: 1.8227 1.7949 sec/batch
Epoch 4/20  Iteration 617/3560 Training loss: 1.8220 1.8643 sec/batch
Epoch 4/20  Iteration 618/3560 Training loss: 1.8216 1.7448 sec/batch
Epoch 4/20  Iteration 619/3560 Training loss: 1.8210 1.9951 sec/batch
Epoch 4/20  Iteration 620/3560 Training loss: 1.8204 2.1944 sec/batch
Epoch 4/20  Iteration 621/3560 Training loss: 1.8200 1.7463 sec/batch
Epoch 4/20  Iteration 622/3560 Training loss: 1.8194 1.7523 sec/batch
Epoch 4/20  Iteration 623/3560 Training loss: 1.8186 1.9128 sec/batch
Epoch 4/20  Iteration 624/3560 Training loss: 1.8184 1.7446 sec/batch
Epoch 4/20  Iteration 625/3560 Training loss: 1.8177 2.2969 sec/batch
Epoch 4/20  Iteration 626/3560 Training loss: 1.8173 1.8338 sec/batch
Epoch 4/20  Iteration 627/3560 Training loss: 1.8166 1.8443 sec/batch
Epoch 4/20  Iteration 628/3560 Training loss: 1.8160 1.9442 sec/batch
Epoch 4/20  Iteration 629/3560 Training loss: 1.8154 1.7104 sec/batch
Epoch 4/20  Iteration 630/3560 Training loss: 1.8151 1.7411 sec/batch
Epoch 4/20  Iteration 631/3560 Training loss: 1.8147 1.7413 sec/batch
Epoch 4/20  Iteration 632/3560 Training loss: 1.8140 1.7179 sec/batch
Epoch 4/20  Iteration 633/3560 Training loss: 1.8133 1.7926 sec/batch
Epoch 4/20  Iteration 634/3560 Training loss: 1.8124 2.0084 sec/batch
Epoch 4/20  Iteration 635/3560 Training loss: 1.8120 1.9011 sec/batch
Epoch 4/20  Iteration 636/3560 Training loss: 1.8117 1.7409 sec/batch
Epoch 4/20  Iteration 637/3560 Training loss: 1.8113 1.6967 sec/batch
Epoch 4/20  Iteration 638/3560 Training loss: 1.8108 1.7060 sec/batch
Epoch 4/20  Iteration 639/3560 Training loss: 1.8103 1.7560 sec/batch
Epoch 4/20  Iteration 640/3560 Training loss: 1.8099 2.0940 sec/batch
Epoch 4/20  Iteration 641/3560 Training loss: 1.8095 2.3291 sec/batch
Epoch 4/20  Iteration 642/3560 Training loss: 1.8092 2.2242 sec/batch
Epoch 4/20  Iteration 643/3560 Training loss: 1.8090 1.7646 sec/batch
Epoch 4/20  Iteration 644/3560 Training loss: 1.8087 1.7417 sec/batch
Epoch 4/20  Iteration 645/3560 Training loss: 1.8084 1.9045 sec/batch
Epoch 4/20  Iteration 646/3560 Training loss: 1.8080 1.8575 sec/batch
Epoch 4/20  Iteration 647/3560 Training loss: 1.8076 2.0962 sec/batch
Epoch 4/20  Iteration 648/3560 Training loss: 1.8072 1.8689 sec/batch
Epoch 4/20  Iteration 649/3560 Training loss: 1.8067 1.8960 sec/batch
Epoch 4/20  Iteration 650/3560 Training loss: 1.8061 1.9861 sec/batch
Epoch 4/20  Iteration 651/3560 Training loss: 1.8059 1.8151 sec/batch
Epoch 4/20  Iteration 652/3560 Training loss: 1.8055 1.7425 sec/batch
Epoch 4/20  Iteration 653/3560 Training loss: 1.8051 1.9058 sec/batch
Epoch 4/20  Iteration 654/3560 Training loss: 1.8048 1.7436 sec/batch
Epoch 4/20  Iteration 655/3560 Training loss: 1.8045 1.7320 sec/batch
Epoch 4/20  Iteration 656/3560 Training loss: 1.8039 1.7302 sec/batch
Epoch 4/20  Iteration 657/3560 Training loss: 1.8034 1.8399 sec/batch
Epoch 4/20  Iteration 658/3560 Training loss: 1.8032 1.7651 sec/batch
Epoch 4/20  Iteration 659/3560 Training loss: 1.8029 1.7200 sec/batch
Epoch 4/20  Iteration 660/3560 Training loss: 1.8023 1.6969 sec/batch
Epoch 4/20  Iteration 661/3560 Training loss: 1.8022 1.7325 sec/batch
Epoch 4/20  Iteration 662/3560 Training loss: 1.8020 1.7393 sec/batch
Epoch 4/20  Iteration 663/3560 Training loss: 1.8017 1.7328 sec/batch
Epoch 4/20  Iteration 664/3560 Training loss: 1.8014 1.7424 sec/batch
Epoch 4/20  Iteration 665/3560 Training loss: 1.8008 1.7460 sec/batch
Epoch 4/20  Iteration 666/3560 Training loss: 1.8003 1.7083 sec/batch
Epoch 4/20  Iteration 667/3560 Training loss: 1.8001 1.7326 sec/batch
Epoch 4/20  Iteration 668/3560 Training loss: 1.7998 1.7512 sec/batch
Epoch 4/20  Iteration 669/3560 Training loss: 1.7995 1.7404 sec/batch
Epoch 4/20  Iteration 670/3560 Training loss: 1.7993 1.7266 sec/batch
Epoch 4/20  Iteration 671/3560 Training loss: 1.7991 1.7009 sec/batch
Epoch 4/20  Iteration 672/3560 Training loss: 1.7989 1.8788 sec/batch
Epoch 4/20  Iteration 673/3560 Training loss: 1.7987 1.7853 sec/batch
Epoch 4/20  Iteration 674/3560 Training loss: 1.7984 1.7008 sec/batch
Epoch 4/20  Iteration 675/3560 Training loss: 1.7983 1.7321 sec/batch
Epoch 4/20  Iteration 676/3560 Training loss: 1.7980 1.9323 sec/batch
Epoch 4/20  Iteration 677/3560 Training loss: 1.7977 1.7486 sec/batch
Epoch 4/20  Iteration 678/3560 Training loss: 1.7974 1.7981 sec/batch
Epoch 4/20  Iteration 679/3560 Training loss: 1.7971 1.7436 sec/batch
Epoch 4/20  Iteration 680/3560 Training loss: 1.7969 1.7473 sec/batch
Epoch 4/20  Iteration 681/3560 Training loss: 1.7966 1.9091 sec/batch
Epoch 4/20  Iteration 682/3560 Training loss: 1.7965 1.7415 sec/batch
Epoch 4/20  Iteration 683/3560 Training loss: 1.7963 1.7340 sec/batch
Epoch 4/20  Iteration 684/3560 Training loss: 1.7959 1.7314 sec/batch
Epoch 4/20  Iteration 685/3560 Training loss: 1.7954 1.7320 sec/batch
Epoch 4/20  Iteration 686/3560 Training loss: 1.7954 1.6958 sec/batch
Epoch 4/20  Iteration 687/3560 Training loss: 1.7952 1.6976 sec/batch
Epoch 4/20  Iteration 688/3560 Training loss: 1.7949 1.7221 sec/batch
Epoch 4/20  Iteration 689/3560 Training loss: 1.7946 1.7130 sec/batch
Epoch 4/20  Iteration 690/3560 Training loss: 1.7943 1.7058 sec/batch
Epoch 4/20  Iteration 691/3560 Training loss: 1.7942 1.7000 sec/batch
Epoch 4/20  Iteration 692/3560 Training loss: 1.7939 1.7148 sec/batch
Epoch 4/20  Iteration 693/3560 Training loss: 1.7934 1.7298 sec/batch
Epoch 4/20  Iteration 694/3560 Training loss: 1.7933 1.6967 sec/batch
Epoch 4/20  Iteration 695/3560 Training loss: 1.7932 1.7350 sec/batch
Epoch 4/20  Iteration 696/3560 Training loss: 1.7930 1.8018 sec/batch
Epoch 4/20  Iteration 697/3560 Training loss: 1.7928 1.7511 sec/batch
Epoch 4/20  Iteration 698/3560 Training loss: 1.7925 1.9695 sec/batch
Epoch 4/20  Iteration 699/3560 Training loss: 1.7922 1.9260 sec/batch
Epoch 4/20  Iteration 700/3560 Training loss: 1.7919 2.0218 sec/batch
Epoch 4/20  Iteration 701/3560 Training loss: 1.7918 1.8352 sec/batch
Epoch 4/20  Iteration 702/3560 Training loss: 1.7919 1.7730 sec/batch
Epoch 4/20  Iteration 703/3560 Training loss: 1.7916 1.7341 sec/batch
Epoch 4/20  Iteration 704/3560 Training loss: 1.7913 1.7269 sec/batch
Epoch 4/20  Iteration 705/3560 Training loss: 1.7910 1.9329 sec/batch
Epoch 4/20  Iteration 706/3560 Training loss: 1.7906 1.7564 sec/batch
Epoch 4/20  Iteration 707/3560 Training loss: 1.7905 1.7242 sec/batch
Epoch 4/20  Iteration 708/3560 Training loss: 1.7902 1.6916 sec/batch
Epoch 4/20  Iteration 709/3560 Training loss: 1.7901 1.7405 sec/batch
Epoch 4/20  Iteration 710/3560 Training loss: 1.7898 1.6942 sec/batch
Epoch 4/20  Iteration 711/3560 Training loss: 1.7895 1.7268 sec/batch
Epoch 4/20  Iteration 712/3560 Training loss: 1.7894 1.7021 sec/batch
Epoch 5/20  Iteration 713/3560 Training loss: 1.8174 1.7271 sec/batch
Epoch 5/20  Iteration 714/3560 Training loss: 1.7743 1.7450 sec/batch
Epoch 5/20  Iteration 715/3560 Training loss: 1.7619 1.7043 sec/batch
Epoch 5/20  Iteration 716/3560 Training loss: 1.7564 1.7076 sec/batch
Epoch 5/20  Iteration 717/3560 Training loss: 1.7536 1.6979 sec/batch
Epoch 5/20  Iteration 718/3560 Training loss: 1.7422 1.7293 sec/batch
Epoch 5/20  Iteration 719/3560 Training loss: 1.7423 1.7000 sec/batch
Epoch 5/20  Iteration 720/3560 Training loss: 1.7396 1.7332 sec/batch
Epoch 5/20  Iteration 721/3560 Training loss: 1.7419 1.7243 sec/batch
Epoch 5/20  Iteration 722/3560 Training loss: 1.7413 1.9835 sec/batch
Epoch 5/20  Iteration 723/3560 Training loss: 1.7372 2.0545 sec/batch
Epoch 5/20  Iteration 724/3560 Training loss: 1.7354 1.7508 sec/batch
Epoch 5/20  Iteration 725/3560 Training loss: 1.7349 1.7206 sec/batch
Epoch 5/20  Iteration 726/3560 Training loss: 1.7374 1.7088 sec/batch
Epoch 5/20  Iteration 727/3560 Training loss: 1.7359 1.6825 sec/batch
Epoch 5/20  Iteration 728/3560 Training loss: 1.7345 1.7240 sec/batch
Epoch 5/20  Iteration 729/3560 Training loss: 1.7341 1.7352 sec/batch
Epoch 5/20  Iteration 730/3560 Training loss: 1.7356 1.7250 sec/batch
Epoch 5/20  Iteration 731/3560 Training loss: 1.7353 1.7445 sec/batch
Epoch 5/20  Iteration 732/3560 Training loss: 1.7355 1.6976 sec/batch
Epoch 5/20  Iteration 733/3560 Training loss: 1.7349 1.7384 sec/batch
Epoch 5/20  Iteration 734/3560 Training loss: 1.7361 1.7106 sec/batch
Epoch 5/20  Iteration 735/3560 Training loss: 1.7350 1.7017 sec/batch
Epoch 5/20  Iteration 736/3560 Training loss: 1.7341 1.6973 sec/batch
Epoch 5/20  Iteration 737/3560 Training loss: 1.7339 1.7321 sec/batch
Epoch 5/20  Iteration 738/3560 Training loss: 1.7325 1.7264 sec/batch
Epoch 5/20  Iteration 739/3560 Training loss: 1.7313 1.8253 sec/batch
Epoch 5/20  Iteration 740/3560 Training loss: 1.7315 1.7624 sec/batch
Epoch 5/20  Iteration 741/3560 Training loss: 1.7323 1.7564 sec/batch
Epoch 5/20  Iteration 742/3560 Training loss: 1.7324 1.6938 sec/batch
Epoch 5/20  Iteration 743/3560 Training loss: 1.7324 1.7322 sec/batch
Epoch 5/20  Iteration 744/3560 Training loss: 1.7315 1.6985 sec/batch
Epoch 5/20  Iteration 745/3560 Training loss: 1.7317 1.9108 sec/batch
Epoch 5/20  Iteration 746/3560 Training loss: 1.7319 1.6999 sec/batch
Epoch 5/20  Iteration 747/3560 Training loss: 1.7314 1.7285 sec/batch
Epoch 5/20  Iteration 748/3560 Training loss: 1.7311 1.6995 sec/batch
Epoch 5/20  Iteration 749/3560 Training loss: 1.7300 1.7074 sec/batch
Epoch 5/20  Iteration 750/3560 Training loss: 1.7287 1.7342 sec/batch
Epoch 5/20  Iteration 751/3560 Training loss: 1.7272 1.7333 sec/batch
Epoch 5/20  Iteration 752/3560 Training loss: 1.7263 1.7480 sec/batch
Epoch 5/20  Iteration 753/3560 Training loss: 1.7256 1.7051 sec/batch
Epoch 5/20  Iteration 754/3560 Training loss: 1.7257 1.7346 sec/batch
Epoch 5/20  Iteration 755/3560 Training loss: 1.7249 1.7231 sec/batch
Epoch 5/20  Iteration 756/3560 Training loss: 1.7239 1.9428 sec/batch
Epoch 5/20  Iteration 757/3560 Training loss: 1.7240 1.8564 sec/batch
Epoch 5/20  Iteration 758/3560 Training loss: 1.7228 1.7244 sec/batch
Epoch 5/20  Iteration 759/3560 Training loss: 1.7222 1.7338 sec/batch
Epoch 5/20  Iteration 760/3560 Training loss: 1.7216 1.7039 sec/batch
Epoch 5/20  Iteration 761/3560 Training loss: 1.7212 1.7033 sec/batch
Epoch 5/20  Iteration 762/3560 Training loss: 1.7218 1.7356 sec/batch
Epoch 5/20  Iteration 763/3560 Training loss: 1.7212 1.7168 sec/batch
Epoch 5/20  Iteration 764/3560 Training loss: 1.7218 1.6928 sec/batch
Epoch 5/20  Iteration 765/3560 Training loss: 1.7212 1.7236 sec/batch
Epoch 5/20  Iteration 766/3560 Training loss: 1.7208 1.7040 sec/batch
Epoch 5/20  Iteration 767/3560 Training loss: 1.7204 1.7330 sec/batch
Epoch 5/20  Iteration 768/3560 Training loss: 1.7204 1.7022 sec/batch
Epoch 5/20  Iteration 769/3560 Training loss: 1.7205 1.7116 sec/batch
Epoch 5/20  Iteration 770/3560 Training loss: 1.7200 1.7468 sec/batch
Epoch 5/20  Iteration 771/3560 Training loss: 1.7193 1.7118 sec/batch
Epoch 5/20  Iteration 772/3560 Training loss: 1.7197 1.7167 sec/batch
Epoch 5/20  Iteration 773/3560 Training loss: 1.7195 1.7141 sec/batch
Epoch 5/20  Iteration 774/3560 Training loss: 1.7200 1.9983 sec/batch
Epoch 5/20  Iteration 775/3560 Training loss: 1.7202 1.8622 sec/batch
Epoch 5/20  Iteration 776/3560 Training loss: 1.7201 1.7447 sec/batch
Epoch 5/20  Iteration 777/3560 Training loss: 1.7198 1.7404 sec/batch
Epoch 5/20  Iteration 778/3560 Training loss: 1.7200 2.0361 sec/batch
Epoch 5/20  Iteration 779/3560 Training loss: 1.7201 1.9016 sec/batch
Epoch 5/20  Iteration 780/3560 Training loss: 1.7196 1.7407 sec/batch
Epoch 5/20  Iteration 781/3560 Training loss: 1.7194 1.7349 sec/batch
Epoch 5/20  Iteration 782/3560 Training loss: 1.7191 1.8708 sec/batch
Epoch 5/20  Iteration 783/3560 Training loss: 1.7194 1.9687 sec/batch
Epoch 5/20  Iteration 784/3560 Training loss: 1.7194 1.9228 sec/batch
Epoch 5/20  Iteration 785/3560 Training loss: 1.7196 1.9739 sec/batch
Epoch 5/20  Iteration 786/3560 Training loss: 1.7192 1.8190 sec/batch
Epoch 5/20  Iteration 787/3560 Training loss: 1.7189 1.7367 sec/batch
Epoch 5/20  Iteration 788/3560 Training loss: 1.7190 1.8349 sec/batch
Epoch 5/20  Iteration 789/3560 Training loss: 1.7187 1.7328 sec/batch
Epoch 5/20  Iteration 790/3560 Training loss: 1.7186 1.7436 sec/batch
Epoch 5/20  Iteration 791/3560 Training loss: 1.7180 1.7041 sec/batch
Epoch 5/20  Iteration 792/3560 Training loss: 1.7177 1.7142 sec/batch
Epoch 5/20  Iteration 793/3560 Training loss: 1.7169 2.0572 sec/batch
Epoch 5/20  Iteration 794/3560 Training loss: 1.7170 1.9024 sec/batch
Epoch 5/20  Iteration 795/3560 Training loss: 1.7164 1.7157 sec/batch
Epoch 5/20  Iteration 796/3560 Training loss: 1.7162 1.7389 sec/batch
Epoch 5/20  Iteration 797/3560 Training loss: 1.7157 1.7145 sec/batch
Epoch 5/20  Iteration 798/3560 Training loss: 1.7152 1.8824 sec/batch
Epoch 5/20  Iteration 799/3560 Training loss: 1.7149 1.7821 sec/batch
Epoch 5/20  Iteration 800/3560 Training loss: 1.7145 1.7192 sec/batch
Epoch 5/20  Iteration 801/3560 Training loss: 1.7138 1.7194 sec/batch
Epoch 5/20  Iteration 802/3560 Training loss: 1.7138 1.7234 sec/batch
Epoch 5/20  Iteration 803/3560 Training loss: 1.7133 1.7024 sec/batch
Epoch 5/20  Iteration 804/3560 Training loss: 1.7130 2.1381 sec/batch
Epoch 5/20  Iteration 805/3560 Training loss: 1.7124 1.7866 sec/batch
Epoch 5/20  Iteration 806/3560 Training loss: 1.7120 1.7321 sec/batch
Epoch 5/20  Iteration 807/3560 Training loss: 1.7116 1.7957 sec/batch
Epoch 5/20  Iteration 808/3560 Training loss: 1.7114 1.8002 sec/batch
Epoch 5/20  Iteration 809/3560 Training loss: 1.7110 1.7615 sec/batch
Epoch 5/20  Iteration 810/3560 Training loss: 1.7105 1.7046 sec/batch
Epoch 5/20  Iteration 811/3560 Training loss: 1.7100 1.7266 sec/batch
Epoch 5/20  Iteration 812/3560 Training loss: 1.7094 1.7016 sec/batch
Epoch 5/20  Iteration 813/3560 Training loss: 1.7092 1.7520 sec/batch
Epoch 5/20  Iteration 814/3560 Training loss: 1.7089 1.7274 sec/batch
Epoch 5/20  Iteration 815/3560 Training loss: 1.7086 1.7319 sec/batch
Epoch 5/20  Iteration 816/3560 Training loss: 1.7082 1.7546 sec/batch
Epoch 5/20  Iteration 817/3560 Training loss: 1.7079 1.7377 sec/batch
Epoch 5/20  Iteration 818/3560 Training loss: 1.7076 1.7440 sec/batch
Epoch 5/20  Iteration 819/3560 Training loss: 1.7073 1.9875 sec/batch
Epoch 5/20  Iteration 820/3560 Training loss: 1.7071 1.8946 sec/batch
Epoch 5/20  Iteration 821/3560 Training loss: 1.7069 1.7413 sec/batch
Epoch 5/20  Iteration 822/3560 Training loss: 1.7067 1.7326 sec/batch
Epoch 5/20  Iteration 823/3560 Training loss: 1.7064 1.8083 sec/batch
Epoch 5/20  Iteration 824/3560 Training loss: 1.7062 1.9450 sec/batch
...
Epoch 5/20  Iteration 890/3560 Training loss: 1.6939 1.9887 sec/batch
Epoch 6/20  Iteration 891/3560 Training loss: 1.7490 1.9834 sec/batch
...
Epoch 6/20  Iteration 1068/3560 Training loss: 1.6259 1.7413 sec/batch
Epoch 7/20  Iteration 1069/3560 Training loss: 1.6949 1.7725 sec/batch
...
Epoch 7/20  Iteration 1246/3560 Training loss: 1.5746 1.7222 sec/batch
Epoch 8/20  Iteration 1247/3560 Training loss: 1.6392 1.7119 sec/batch
...
Epoch 8/20  Iteration 1424/3560 Training loss: 1.5344 1.7356 sec/batch
Epoch 9/20  Iteration 1425/3560 Training loss: 1.5986 1.7313 sec/batch
...
Epoch 9/20  Iteration 1602/3560 Training loss: 1.5006 1.7295 sec/batch
Epoch 10/20  Iteration 1603/3560 Training loss: 1.5782 1.7216 sec/batch
...
Epoch 10/20  Iteration 1726/3560 Training loss: 1.4742 1.7408 sec/batch
Epoch 10/20  Iteration 1727/3560 Training loss: 1.4742 1.7083 sec/batch
Epoch 10/20  Iteration 1728/3560 Training loss: 1.4738 1.7242 sec/batch
Epoch 10/20  Iteration 1729/3560 Training loss: 1.4738 1.7285 sec/batch
Epoch 10/20  Iteration 1730/3560 Training loss: 1.4738 1.7283 sec/batch
Epoch 10/20  Iteration 1731/3560 Training loss: 1.4737 1.7244 sec/batch
Epoch 10/20  Iteration 1732/3560 Training loss: 1.4734 1.7193 sec/batch
Epoch 10/20  Iteration 1733/3560 Training loss: 1.4730 1.7301 sec/batch
Epoch 10/20  Iteration 1734/3560 Training loss: 1.4728 1.7014 sec/batch
Epoch 10/20  Iteration 1735/3560 Training loss: 1.4729 1.7234 sec/batch
Epoch 10/20  Iteration 1736/3560 Training loss: 1.4728 1.7324 sec/batch
Epoch 10/20  Iteration 1737/3560 Training loss: 1.4728 1.7117 sec/batch
Epoch 10/20  Iteration 1738/3560 Training loss: 1.4728 1.7264 sec/batch
Epoch 10/20  Iteration 1739/3560 Training loss: 1.4729 1.6987 sec/batch
Epoch 10/20  Iteration 1740/3560 Training loss: 1.4730 1.6853 sec/batch
Epoch 10/20  Iteration 1741/3560 Training loss: 1.4730 1.7037 sec/batch
Epoch 10/20  Iteration 1742/3560 Training loss: 1.4729 1.7220 sec/batch
Epoch 10/20  Iteration 1743/3560 Training loss: 1.4732 1.6983 sec/batch
Epoch 10/20  Iteration 1744/3560 Training loss: 1.4732 1.7327 sec/batch
Epoch 10/20  Iteration 1745/3560 Training loss: 1.4731 1.7275 sec/batch
Epoch 10/20  Iteration 1746/3560 Training loss: 1.4733 1.7375 sec/batch
Epoch 10/20  Iteration 1747/3560 Training loss: 1.4731 1.7006 sec/batch
Epoch 10/20  Iteration 1748/3560 Training loss: 1.4732 1.6975 sec/batch
Epoch 10/20  Iteration 1749/3560 Training loss: 1.4731 1.7308 sec/batch
Epoch 10/20  Iteration 1750/3560 Training loss: 1.4734 1.7137 sec/batch
Epoch 10/20  Iteration 1751/3560 Training loss: 1.4735 1.7202 sec/batch
Epoch 10/20  Iteration 1752/3560 Training loss: 1.4733 1.7265 sec/batch
Epoch 10/20  Iteration 1753/3560 Training loss: 1.4731 1.6959 sec/batch
Epoch 10/20  Iteration 1754/3560 Training loss: 1.4730 1.6949 sec/batch
Epoch 10/20  Iteration 1755/3560 Training loss: 1.4730 1.7331 sec/batch
Epoch 10/20  Iteration 1756/3560 Training loss: 1.4730 1.7317 sec/batch
Epoch 10/20  Iteration 1757/3560 Training loss: 1.4730 1.7307 sec/batch
Epoch 10/20  Iteration 1758/3560 Training loss: 1.4730 1.7072 sec/batch
Epoch 10/20  Iteration 1759/3560 Training loss: 1.4731 1.8423 sec/batch
Epoch 10/20  Iteration 1760/3560 Training loss: 1.4730 1.7861 sec/batch
Epoch 10/20  Iteration 1761/3560 Training loss: 1.4727 1.7729 sec/batch
Epoch 10/20  Iteration 1762/3560 Training loss: 1.4728 1.7205 sec/batch
Epoch 10/20  Iteration 1763/3560 Training loss: 1.4730 1.6966 sec/batch
Epoch 10/20  Iteration 1764/3560 Training loss: 1.4730 1.7284 sec/batch
Epoch 10/20  Iteration 1765/3560 Training loss: 1.4730 1.7248 sec/batch
Epoch 10/20  Iteration 1766/3560 Training loss: 1.4730 1.7155 sec/batch
Epoch 10/20  Iteration 1767/3560 Training loss: 1.4730 1.6932 sec/batch
Epoch 10/20  Iteration 1768/3560 Training loss: 1.4730 1.7056 sec/batch
Epoch 10/20  Iteration 1769/3560 Training loss: 1.4730 1.7079 sec/batch
Epoch 10/20  Iteration 1770/3560 Training loss: 1.4735 1.7082 sec/batch
Epoch 10/20  Iteration 1771/3560 Training loss: 1.4735 1.6981 sec/batch
Epoch 10/20  Iteration 1772/3560 Training loss: 1.4734 1.7315 sec/batch
Epoch 10/20  Iteration 1773/3560 Training loss: 1.4733 1.7263 sec/batch
Epoch 10/20  Iteration 1774/3560 Training loss: 1.4731 1.7289 sec/batch
Epoch 10/20  Iteration 1775/3560 Training loss: 1.4732 1.7301 sec/batch
Epoch 10/20  Iteration 1776/3560 Training loss: 1.4731 1.6952 sec/batch
Epoch 10/20  Iteration 1777/3560 Training loss: 1.4732 1.7223 sec/batch
Epoch 10/20  Iteration 1778/3560 Training loss: 1.4730 1.7340 sec/batch
Epoch 10/20  Iteration 1779/3560 Training loss: 1.4729 1.7227 sec/batch
Epoch 10/20  Iteration 1780/3560 Training loss: 1.4731 1.7277 sec/batch
Epoch 11/20  Iteration 1781/3560 Training loss: 1.5516 1.7086 sec/batch
Epoch 11/20  Iteration 1782/3560 Training loss: 1.5130 1.7280 sec/batch
Epoch 11/20  Iteration 1783/3560 Training loss: 1.4970 1.7123 sec/batch
Epoch 11/20  Iteration 1784/3560 Training loss: 1.4919 1.7302 sec/batch
Epoch 11/20  Iteration 1785/3560 Training loss: 1.4827 1.7177 sec/batch
Epoch 11/20  Iteration 1786/3560 Training loss: 1.4715 1.7427 sec/batch
Epoch 11/20  Iteration 1787/3560 Training loss: 1.4724 1.6933 sec/batch
Epoch 11/20  Iteration 1788/3560 Training loss: 1.4698 1.7341 sec/batch
Epoch 11/20  Iteration 1789/3560 Training loss: 1.4700 1.6952 sec/batch
Epoch 11/20  Iteration 1790/3560 Training loss: 1.4699 1.7050 sec/batch
Epoch 11/20  Iteration 1791/3560 Training loss: 1.4667 1.7004 sec/batch
Epoch 11/20  Iteration 1792/3560 Training loss: 1.4654 1.7343 sec/batch
Epoch 11/20  Iteration 1793/3560 Training loss: 1.4659 1.7197 sec/batch
Epoch 11/20  Iteration 1794/3560 Training loss: 1.4676 1.6973 sec/batch
Epoch 11/20  Iteration 1795/3560 Training loss: 1.4663 1.7684 sec/batch
Epoch 11/20  Iteration 1796/3560 Training loss: 1.4642 1.7507 sec/batch
Epoch 11/20  Iteration 1797/3560 Training loss: 1.4651 1.7084 sec/batch
Epoch 11/20  Iteration 1798/3560 Training loss: 1.4664 1.6951 sec/batch
Epoch 11/20  Iteration 1799/3560 Training loss: 1.4666 1.7331 sec/batch
Epoch 11/20  Iteration 1800/3560 Training loss: 1.4675 1.7324 sec/batch
Epoch 11/20  Iteration 1801/3560 Training loss: 1.4668 1.7255 sec/batch
Epoch 11/20  Iteration 1802/3560 Training loss: 1.4672 1.6893 sec/batch
Epoch 11/20  Iteration 1803/3560 Training loss: 1.4665 1.7163 sec/batch
Epoch 11/20  Iteration 1804/3560 Training loss: 1.4662 1.7288 sec/batch
Epoch 11/20  Iteration 1805/3560 Training loss: 1.4662 1.7018 sec/batch
Epoch 11/20  Iteration 1806/3560 Training loss: 1.4649 1.7410 sec/batch
Epoch 11/20  Iteration 1807/3560 Training loss: 1.4640 1.7209 sec/batch
Epoch 11/20  Iteration 1808/3560 Training loss: 1.4646 1.7252 sec/batch
Epoch 11/20  Iteration 1809/3560 Training loss: 1.4653 1.7311 sec/batch
Epoch 11/20  Iteration 1810/3560 Training loss: 1.4654 1.6901 sec/batch
Epoch 11/20  Iteration 1811/3560 Training loss: 1.4650 1.7238 sec/batch
Epoch 11/20  Iteration 1812/3560 Training loss: 1.4641 1.6994 sec/batch
Epoch 11/20  Iteration 1813/3560 Training loss: 1.4642 1.7362 sec/batch
Epoch 11/20  Iteration 1814/3560 Training loss: 1.4643 1.6918 sec/batch
Epoch 11/20  Iteration 1815/3560 Training loss: 1.4642 1.7195 sec/batch
Epoch 11/20  Iteration 1816/3560 Training loss: 1.4642 1.7136 sec/batch
Epoch 11/20  Iteration 1817/3560 Training loss: 1.4633 1.7169 sec/batch
Epoch 11/20  Iteration 1818/3560 Training loss: 1.4620 1.7265 sec/batch
Epoch 11/20  Iteration 1819/3560 Training loss: 1.4606 1.7286 sec/batch
Epoch 11/20  Iteration 1820/3560 Training loss: 1.4600 1.7305 sec/batch
Epoch 11/20  Iteration 1821/3560 Training loss: 1.4594 1.7034 sec/batch
Epoch 11/20  Iteration 1822/3560 Training loss: 1.4601 1.8727 sec/batch
Epoch 11/20  Iteration 1823/3560 Training loss: 1.4595 1.7717 sec/batch
Epoch 11/20  Iteration 1824/3560 Training loss: 1.4589 1.7097 sec/batch
Epoch 11/20  Iteration 1825/3560 Training loss: 1.4591 1.7092 sec/batch
Epoch 11/20  Iteration 1826/3560 Training loss: 1.4580 1.7333 sec/batch
Epoch 11/20  Iteration 1827/3560 Training loss: 1.4578 1.7308 sec/batch
Epoch 11/20  Iteration 1828/3560 Training loss: 1.4575 1.7034 sec/batch
Epoch 11/20  Iteration 1829/3560 Training loss: 1.4572 1.7256 sec/batch
Epoch 11/20  Iteration 1830/3560 Training loss: 1.4574 1.8691 sec/batch
Epoch 11/20  Iteration 1831/3560 Training loss: 1.4568 1.7383 sec/batch
Epoch 11/20  Iteration 1832/3560 Training loss: 1.4572 1.7311 sec/batch
Epoch 11/20  Iteration 1833/3560 Training loss: 1.4572 1.7287 sec/batch
Epoch 11/20  Iteration 1834/3560 Training loss: 1.4571 1.6933 sec/batch
Epoch 11/20  Iteration 1835/3560 Training loss: 1.4568 1.7319 sec/batch
Epoch 11/20  Iteration 1836/3560 Training loss: 1.4569 1.7240 sec/batch
Epoch 11/20  Iteration 1837/3560 Training loss: 1.4571 1.7222 sec/batch
Epoch 11/20  Iteration 1838/3560 Training loss: 1.4567 1.7362 sec/batch
Epoch 11/20  Iteration 1839/3560 Training loss: 1.4562 1.7003 sec/batch
Epoch 11/20  Iteration 1840/3560 Training loss: 1.4568 1.7209 sec/batch
Epoch 11/20  Iteration 1841/3560 Training loss: 1.4568 1.9268 sec/batch
Epoch 11/20  Iteration 1842/3560 Training loss: 1.4577 1.7422 sec/batch
Epoch 11/20  Iteration 1843/3560 Training loss: 1.4581 1.7103 sec/batch
Epoch 11/20  Iteration 1844/3560 Training loss: 1.4582 1.7273 sec/batch
Epoch 11/20  Iteration 1845/3560 Training loss: 1.4581 1.7357 sec/batch
Epoch 11/20  Iteration 1846/3560 Training loss: 1.4582 1.7342 sec/batch
Epoch 11/20  Iteration 1847/3560 Training loss: 1.4584 1.7177 sec/batch
Epoch 11/20  Iteration 1848/3560 Training loss: 1.4580 1.7009 sec/batch
Epoch 11/20  Iteration 1849/3560 Training loss: 1.4580 1.7300 sec/batch
Epoch 11/20  Iteration 1850/3560 Training loss: 1.4578 1.7469 sec/batch
Epoch 11/20  Iteration 1851/3560 Training loss: 1.4582 1.7108 sec/batch
Epoch 11/20  Iteration 1852/3560 Training loss: 1.4585 1.6958 sec/batch
Epoch 11/20  Iteration 1853/3560 Training loss: 1.4590 1.7007 sec/batch
Epoch 11/20  Iteration 1854/3560 Training loss: 1.4586 1.7031 sec/batch
Epoch 11/20  Iteration 1855/3560 Training loss: 1.4584 1.7103 sec/batch
Epoch 11/20  Iteration 1856/3560 Training loss: 1.4586 1.7006 sec/batch
Epoch 11/20  Iteration 1857/3560 Training loss: 1.4584 1.7342 sec/batch
Epoch 11/20  Iteration 1858/3560 Training loss: 1.4583 1.7262 sec/batch
Epoch 11/20  Iteration 1859/3560 Training loss: 1.4577 1.7085 sec/batch
Epoch 11/20  Iteration 1860/3560 Training loss: 1.4576 1.7278 sec/batch
Epoch 11/20  Iteration 1861/3560 Training loss: 1.4570 1.7323 sec/batch
Epoch 11/20  Iteration 1862/3560 Training loss: 1.4569 1.7270 sec/batch
Epoch 11/20  Iteration 1863/3560 Training loss: 1.4564 1.7314 sec/batch
Epoch 11/20  Iteration 1864/3560 Training loss: 1.4563 1.7133 sec/batch
Epoch 11/20  Iteration 1865/3560 Training loss: 1.4559 1.7244 sec/batch
Epoch 11/20  Iteration 1866/3560 Training loss: 1.4556 1.7325 sec/batch
Epoch 11/20  Iteration 1867/3560 Training loss: 1.4554 1.7060 sec/batch
Epoch 11/20  Iteration 1868/3560 Training loss: 1.4551 1.7181 sec/batch
Epoch 11/20  Iteration 1869/3560 Training loss: 1.4547 1.7209 sec/batch
Epoch 11/20  Iteration 1870/3560 Training loss: 1.4548 1.7303 sec/batch
Epoch 11/20  Iteration 1871/3560 Training loss: 1.4545 1.7259 sec/batch
Epoch 11/20  Iteration 1872/3560 Training loss: 1.4543 1.7128 sec/batch
Epoch 11/20  Iteration 1873/3560 Training loss: 1.4539 1.7125 sec/batch
Epoch 11/20  Iteration 1874/3560 Training loss: 1.4536 1.7133 sec/batch
Epoch 11/20  Iteration 1875/3560 Training loss: 1.4533 1.7066 sec/batch
Epoch 11/20  Iteration 1876/3560 Training loss: 1.4533 1.7284 sec/batch
Epoch 11/20  Iteration 1877/3560 Training loss: 1.4532 1.7023 sec/batch
Epoch 11/20  Iteration 1878/3560 Training loss: 1.4529 1.7248 sec/batch
Epoch 11/20  Iteration 1879/3560 Training loss: 1.4525 1.6927 sec/batch
Epoch 11/20  Iteration 1880/3560 Training loss: 1.4522 1.7135 sec/batch
Epoch 11/20  Iteration 1881/3560 Training loss: 1.4522 1.7055 sec/batch
Epoch 11/20  Iteration 1882/3560 Training loss: 1.4522 1.7013 sec/batch
Epoch 11/20  Iteration 1883/3560 Training loss: 1.4521 1.7356 sec/batch
Epoch 11/20  Iteration 1884/3560 Training loss: 1.4521 1.7166 sec/batch
Epoch 11/20  Iteration 1885/3560 Training loss: 1.4520 1.7263 sec/batch
Epoch 11/20  Iteration 1886/3560 Training loss: 1.4519 1.6915 sec/batch
Epoch 11/20  Iteration 1887/3560 Training loss: 1.4518 1.7090 sec/batch
Epoch 11/20  Iteration 1888/3560 Training loss: 1.4518 1.7190 sec/batch
Epoch 11/20  Iteration 1889/3560 Training loss: 1.4516 1.7036 sec/batch
Epoch 11/20  Iteration 1890/3560 Training loss: 1.4517 1.7345 sec/batch
Epoch 11/20  Iteration 1891/3560 Training loss: 1.4515 1.7232 sec/batch
Epoch 11/20  Iteration 1892/3560 Training loss: 1.4515 1.7205 sec/batch
Epoch 11/20  Iteration 1893/3560 Training loss: 1.4515 1.7137 sec/batch
Epoch 11/20  Iteration 1894/3560 Training loss: 1.4512 1.7274 sec/batch
Epoch 11/20  Iteration 1895/3560 Training loss: 1.4509 1.7305 sec/batch
Epoch 11/20  Iteration 1896/3560 Training loss: 1.4506 1.6965 sec/batch
Epoch 11/20  Iteration 1897/3560 Training loss: 1.4506 1.7348 sec/batch
Epoch 11/20  Iteration 1898/3560 Training loss: 1.4506 1.7125 sec/batch
Epoch 11/20  Iteration 1899/3560 Training loss: 1.4505 1.8615 sec/batch
Epoch 11/20  Iteration 1900/3560 Training loss: 1.4504 1.7915 sec/batch
Epoch 11/20  Iteration 1901/3560 Training loss: 1.4502 1.7361 sec/batch
Epoch 11/20  Iteration 1902/3560 Training loss: 1.4499 1.7824 sec/batch
Epoch 11/20  Iteration 1903/3560 Training loss: 1.4495 1.7471 sec/batch
Epoch 11/20  Iteration 1904/3560 Training loss: 1.4495 1.6964 sec/batch
Epoch 11/20  Iteration 1905/3560 Training loss: 1.4495 1.7310 sec/batch
Epoch 11/20  Iteration 1906/3560 Training loss: 1.4492 1.6996 sec/batch
Epoch 11/20  Iteration 1907/3560 Training loss: 1.4492 1.7152 sec/batch
Epoch 11/20  Iteration 1908/3560 Training loss: 1.4491 1.8230 sec/batch
Epoch 11/20  Iteration 1909/3560 Training loss: 1.4490 1.7241 sec/batch
Epoch 11/20  Iteration 1910/3560 Training loss: 1.4487 1.7340 sec/batch
Epoch 11/20  Iteration 1911/3560 Training loss: 1.4484 1.7152 sec/batch
Epoch 11/20  Iteration 1912/3560 Training loss: 1.4482 1.6995 sec/batch
Epoch 11/20  Iteration 1913/3560 Training loss: 1.4483 1.7340 sec/batch
Epoch 11/20  Iteration 1914/3560 Training loss: 1.4483 1.6816 sec/batch
Epoch 11/20  Iteration 1915/3560 Training loss: 1.4483 1.6916 sec/batch
Epoch 11/20  Iteration 1916/3560 Training loss: 1.4483 1.6753 sec/batch
Epoch 11/20  Iteration 1917/3560 Training loss: 1.4485 1.7978 sec/batch
Epoch 11/20  Iteration 1918/3560 Training loss: 1.4485 1.7318 sec/batch
Epoch 11/20  Iteration 1919/3560 Training loss: 1.4485 1.7045 sec/batch
Epoch 11/20  Iteration 1920/3560 Training loss: 1.4484 1.7253 sec/batch
Epoch 11/20  Iteration 1921/3560 Training loss: 1.4488 1.7246 sec/batch
Epoch 11/20  Iteration 1922/3560 Training loss: 1.4488 1.7198 sec/batch
Epoch 11/20  Iteration 1923/3560 Training loss: 1.4487 1.7222 sec/batch
Epoch 11/20  Iteration 1924/3560 Training loss: 1.4489 1.7269 sec/batch
Epoch 11/20  Iteration 1925/3560 Training loss: 1.4488 1.7573 sec/batch
Epoch 11/20  Iteration 1926/3560 Training loss: 1.4489 1.7358 sec/batch
Epoch 11/20  Iteration 1927/3560 Training loss: 1.4489 1.7020 sec/batch
Epoch 11/20  Iteration 1928/3560 Training loss: 1.4491 1.7033 sec/batch
Epoch 11/20  Iteration 1929/3560 Training loss: 1.4492 1.7083 sec/batch
Epoch 11/20  Iteration 1930/3560 Training loss: 1.4491 1.7004 sec/batch
Epoch 11/20  Iteration 1931/3560 Training loss: 1.4488 1.7289 sec/batch
Epoch 11/20  Iteration 1932/3560 Training loss: 1.4487 1.7220 sec/batch
Epoch 11/20  Iteration 1933/3560 Training loss: 1.4488 1.7188 sec/batch
Epoch 11/20  Iteration 1934/3560 Training loss: 1.4488 1.6947 sec/batch
Epoch 11/20  Iteration 1935/3560 Training loss: 1.4488 1.7325 sec/batch
Epoch 11/20  Iteration 1936/3560 Training loss: 1.4487 1.7246 sec/batch
Epoch 11/20  Iteration 1937/3560 Training loss: 1.4488 1.7280 sec/batch
Epoch 11/20  Iteration 1938/3560 Training loss: 1.4487 1.7034 sec/batch
Epoch 11/20  Iteration 1939/3560 Training loss: 1.4484 1.7260 sec/batch
Epoch 11/20  Iteration 1940/3560 Training loss: 1.4486 1.7303 sec/batch
Epoch 11/20  Iteration 1941/3560 Training loss: 1.4488 1.7294 sec/batch
Epoch 11/20  Iteration 1942/3560 Training loss: 1.4489 1.7525 sec/batch
Epoch 11/20  Iteration 1943/3560 Training loss: 1.4489 1.6939 sec/batch
Epoch 11/20  Iteration 1944/3560 Training loss: 1.4489 1.7354 sec/batch
Epoch 11/20  Iteration 1945/3560 Training loss: 1.4489 1.7382 sec/batch
Epoch 11/20  Iteration 1946/3560 Training loss: 1.4489 1.7164 sec/batch
Epoch 11/20  Iteration 1947/3560 Training loss: 1.4490 1.7191 sec/batch
Epoch 11/20  Iteration 1948/3560 Training loss: 1.4493 1.7009 sec/batch
Epoch 11/20  Iteration 1949/3560 Training loss: 1.4493 1.7172 sec/batch
Epoch 11/20  Iteration 1950/3560 Training loss: 1.4493 1.7265 sec/batch
Epoch 11/20  Iteration 1951/3560 Training loss: 1.4492 1.7197 sec/batch
Epoch 11/20  Iteration 1952/3560 Training loss: 1.4490 1.7363 sec/batch
Epoch 11/20  Iteration 1953/3560 Training loss: 1.4491 1.7354 sec/batch
Epoch 11/20  Iteration 1954/3560 Training loss: 1.4491 1.7261 sec/batch
Epoch 11/20  Iteration 1955/3560 Training loss: 1.4492 1.7274 sec/batch
Epoch 11/20  Iteration 1956/3560 Training loss: 1.4490 1.7326 sec/batch
Epoch 11/20  Iteration 1957/3560 Training loss: 1.4490 1.6969 sec/batch
Epoch 11/20  Iteration 1958/3560 Training loss: 1.4491 1.7228 sec/batch
Epoch 12/20  Iteration 1959/3560 Training loss: 1.5291 1.7351 sec/batch
Epoch 12/20  Iteration 1960/3560 Training loss: 1.4868 1.7460 sec/batch
Epoch 12/20  Iteration 1961/3560 Training loss: 1.4740 1.7039 sec/batch
Epoch 12/20  Iteration 1962/3560 Training loss: 1.4754 1.7287 sec/batch
Epoch 12/20  Iteration 1963/3560 Training loss: 1.4681 1.7387 sec/batch
Epoch 12/20  Iteration 1964/3560 Training loss: 1.4575 1.7270 sec/batch
Epoch 12/20  Iteration 1965/3560 Training loss: 1.4587 1.6996 sec/batch
Epoch 12/20  Iteration 1966/3560 Training loss: 1.4558 1.7235 sec/batch
Epoch 12/20  Iteration 1967/3560 Training loss: 1.4561 1.8251 sec/batch
Epoch 12/20  Iteration 1968/3560 Training loss: 1.4543 1.7692 sec/batch
Epoch 12/20  Iteration 1969/3560 Training loss: 1.4509 1.7372 sec/batch
Epoch 12/20  Iteration 1970/3560 Training loss: 1.4497 1.7310 sec/batch
Epoch 12/20  Iteration 1971/3560 Training loss: 1.4485 1.7314 sec/batch
Epoch 12/20  Iteration 1972/3560 Training loss: 1.4490 1.7240 sec/batch
Epoch 12/20  Iteration 1973/3560 Training loss: 1.4480 1.7226 sec/batch
Epoch 12/20  Iteration 1974/3560 Training loss: 1.4459 1.7116 sec/batch
Epoch 12/20  Iteration 1975/3560 Training loss: 1.4459 1.7192 sec/batch
Epoch 12/20  Iteration 1976/3560 Training loss: 1.4472 1.7275 sec/batch
Epoch 12/20  Iteration 1977/3560 Training loss: 1.4471 1.7488 sec/batch
Epoch 12/20  Iteration 1978/3560 Training loss: 1.4476 1.7246 sec/batch
Epoch 12/20  Iteration 1979/3560 Training loss: 1.4467 1.7262 sec/batch
Epoch 12/20  Iteration 1980/3560 Training loss: 1.4474 1.7303 sec/batch
Epoch 12/20  Iteration 1981/3560 Training loss: 1.4464 1.6933 sec/batch
Epoch 12/20  Iteration 1982/3560 Training loss: 1.4461 1.7446 sec/batch
Epoch 12/20  Iteration 1983/3560 Training loss: 1.4461 1.7142 sec/batch
Epoch 12/20  Iteration 1984/3560 Training loss: 1.4447 1.7287 sec/batch
Epoch 12/20  Iteration 1985/3560 Training loss: 1.4435 1.7044 sec/batch
Epoch 12/20  Iteration 1986/3560 Training loss: 1.4438 1.7194 sec/batch
Epoch 12/20  Iteration 1987/3560 Training loss: 1.4442 1.7282 sec/batch
Epoch 12/20  Iteration 1988/3560 Training loss: 1.4445 1.7037 sec/batch
Epoch 12/20  Iteration 1989/3560 Training loss: 1.4440 1.7040 sec/batch
Epoch 12/20  Iteration 1990/3560 Training loss: 1.4428 1.7227 sec/batch
Epoch 12/20  Iteration 1991/3560 Training loss: 1.4431 1.6905 sec/batch
Epoch 12/20  Iteration 1992/3560 Training loss: 1.4433 1.7259 sec/batch
Epoch 12/20  Iteration 1993/3560 Training loss: 1.4430 1.7326 sec/batch
Epoch 12/20  Iteration 1994/3560 Training loss: 1.4431 1.7360 sec/batch
Epoch 12/20  Iteration 1995/3560 Training loss: 1.4423 1.7346 sec/batch
Epoch 12/20  Iteration 1996/3560 Training loss: 1.4411 1.7260 sec/batch
Epoch 12/20  Iteration 1997/3560 Training loss: 1.4397 1.6994 sec/batch
Epoch 12/20  Iteration 1998/3560 Training loss: 1.4391 1.7357 sec/batch
Epoch 12/20  Iteration 1999/3560 Training loss: 1.4385 1.7132 sec/batch
Epoch 12/20  Iteration 2000/3560 Training loss: 1.4390 1.7144 sec/batch
Epoch 12/20  Iteration 2001/3560 Training loss: 1.4384 1.6945 sec/batch
Epoch 12/20  Iteration 2002/3560 Training loss: 1.4378 1.7245 sec/batch
Epoch 12/20  Iteration 2003/3560 Training loss: 1.4379 1.7193 sec/batch
Epoch 12/20  Iteration 2004/3560 Training loss: 1.4370 1.7376 sec/batch
Epoch 12/20  Iteration 2005/3560 Training loss: 1.4368 1.7018 sec/batch
Epoch 12/20  Iteration 2006/3560 Training loss: 1.4363 1.7155 sec/batch
Epoch 12/20  Iteration 2007/3560 Training loss: 1.4361 1.7214 sec/batch
Epoch 12/20  Iteration 2008/3560 Training loss: 1.4365 1.7108 sec/batch
Epoch 12/20  Iteration 2009/3560 Training loss: 1.4362 1.7209 sec/batch
Epoch 12/20  Iteration 2010/3560 Training loss: 1.4369 1.7068 sec/batch
Epoch 12/20  Iteration 2011/3560 Training loss: 1.4370 1.7354 sec/batch
Epoch 12/20  Iteration 2012/3560 Training loss: 1.4371 1.7089 sec/batch
Epoch 12/20  Iteration 2013/3560 Training loss: 1.4368 1.7172 sec/batch
Epoch 12/20  Iteration 2014/3560 Training loss: 1.4370 1.9160 sec/batch
Epoch 12/20  Iteration 2015/3560 Training loss: 1.4374 1.9242 sec/batch
Epoch 12/20  Iteration 2016/3560 Training loss: 1.4370 1.7432 sec/batch
Epoch 12/20  Iteration 2017/3560 Training loss: 1.4364 1.6974 sec/batch
Epoch 12/20  Iteration 2018/3560 Training loss: 1.4369 1.6933 sec/batch
Epoch 12/20  Iteration 2019/3560 Training loss: 1.4370 1.7425 sec/batch
Epoch 12/20  Iteration 2020/3560 Training loss: 1.4379 1.7235 sec/batch
Epoch 12/20  Iteration 2021/3560 Training loss: 1.4382 1.7219 sec/batch
Epoch 12/20  Iteration 2022/3560 Training loss: 1.4382 1.7076 sec/batch
Epoch 12/20  Iteration 2023/3560 Training loss: 1.4381 1.7390 sec/batch
Epoch 12/20  Iteration 2024/3560 Training loss: 1.4383 1.6874 sec/batch
Epoch 12/20  Iteration 2025/3560 Training loss: 1.4386 1.7270 sec/batch
Epoch 12/20  Iteration 2026/3560 Training loss: 1.4383 1.7245 sec/batch
Epoch 12/20  Iteration 2027/3560 Training loss: 1.4382 1.7266 sec/batch
Epoch 12/20  Iteration 2028/3560 Training loss: 1.4381 1.6941 sec/batch
Epoch 12/20  Iteration 2029/3560 Training loss: 1.4386 1.7471 sec/batch
Epoch 12/20  Iteration 2030/3560 Training loss: 1.4388 1.7283 sec/batch
Epoch 12/20  Iteration 2031/3560 Training loss: 1.4393 1.7288 sec/batch
Epoch 12/20  Iteration 2032/3560 Training loss: 1.4390 1.7507 sec/batch
Epoch 12/20  Iteration 2033/3560 Training loss: 1.4389 1.7373 sec/batch
Epoch 12/20  Iteration 2034/3560 Training loss: 1.4389 1.7478 sec/batch
Epoch 12/20  Iteration 2035/3560 Training loss: 1.4389 1.7479 sec/batch
Epoch 12/20  Iteration 2036/3560 Training loss: 1.4388 1.7383 sec/batch
Epoch 12/20  Iteration 2037/3560 Training loss: 1.4381 1.7931 sec/batch
Epoch 12/20  Iteration 2038/3560 Training loss: 1.4380 1.7786 sec/batch
Epoch 12/20  Iteration 2039/3560 Training loss: 1.4375 1.7637 sec/batch
Epoch 12/20  Iteration 2040/3560 Training loss: 1.4372 1.7028 sec/batch
Epoch 12/20  Iteration 2041/3560 Training loss: 1.4366 1.6951 sec/batch
Epoch 12/20  Iteration 2042/3560 Training loss: 1.4365 1.7070 sec/batch
Epoch 12/20  Iteration 2043/3560 Training loss: 1.4361 1.7260 sec/batch
Epoch 12/20  Iteration 2044/3560 Training loss: 1.4358 1.7022 sec/batch
Epoch 12/20  Iteration 2045/3560 Training loss: 1.4355 1.6834 sec/batch
Epoch 12/20  Iteration 2046/3560 Training loss: 1.4352 1.7366 sec/batch
Epoch 12/20  Iteration 2047/3560 Training loss: 1.4349 1.7227 sec/batch
Epoch 12/20  Iteration 2048/3560 Training loss: 1.4350 1.7258 sec/batch
Epoch 12/20  Iteration 2049/3560 Training loss: 1.4347 1.7353 sec/batch
Epoch 12/20  Iteration 2050/3560 Training loss: 1.4346 1.7064 sec/batch
Epoch 12/20  Iteration 2051/3560 Training loss: 1.4342 1.7085 sec/batch
Epoch 12/20  Iteration 2052/3560 Training loss: 1.4340 1.6906 sec/batch
Epoch 12/20  Iteration 2053/3560 Training loss: 1.4336 1.7346 sec/batch
Epoch 12/20  Iteration 2054/3560 Training loss: 1.4336 1.7163 sec/batch
Epoch 12/20  Iteration 2055/3560 Training loss: 1.4336 1.7198 sec/batch
Epoch 12/20  Iteration 2056/3560 Training loss: 1.4332 1.7381 sec/batch
Epoch 12/20  Iteration 2057/3560 Training loss: 1.4329 1.6974 sec/batch
Epoch 12/20  Iteration 2058/3560 Training loss: 1.4325 1.6965 sec/batch
Epoch 12/20  Iteration 2059/3560 Training loss: 1.4324 1.7233 sec/batch
Epoch 12/20  Iteration 2060/3560 Training loss: 1.4322 1.7217 sec/batch
Epoch 12/20  Iteration 2061/3560 Training loss: 1.4321 1.7162 sec/batch
Epoch 12/20  Iteration 2062/3560 Training loss: 1.4318 1.6981 sec/batch
Epoch 12/20  Iteration 2063/3560 Training loss: 1.4316 1.7465 sec/batch
Epoch 12/20  Iteration 2064/3560 Training loss: 1.4315 1.6873 sec/batch
Epoch 12/20  Iteration 2065/3560 Training loss: 1.4315 1.7211 sec/batch
Epoch 12/20  Iteration 2066/3560 Training loss: 1.4314 1.6894 sec/batch
Epoch 12/20  Iteration 2067/3560 Training loss: 1.4313 1.7144 sec/batch
Epoch 12/20  Iteration 2068/3560 Training loss: 1.4313 1.7373 sec/batch
Epoch 12/20  Iteration 2069/3560 Training loss: 1.4310 1.7058 sec/batch
Epoch 12/20  Iteration 2070/3560 Training loss: 1.4309 1.7361 sec/batch
Epoch 12/20  Iteration 2071/3560 Training loss: 1.4308 1.7210 sec/batch
Epoch 12/20  Iteration 2072/3560 Training loss: 1.4305 1.7254 sec/batch
Epoch 12/20  Iteration 2073/3560 Training loss: 1.4303 1.7137 sec/batch
Epoch 12/20  Iteration 2074/3560 Training loss: 1.4300 1.7410 sec/batch
Epoch 12/20  Iteration 2075/3560 Training loss: 1.4300 1.7153 sec/batch
Epoch 12/20  Iteration 2076/3560 Training loss: 1.4300 1.7227 sec/batch
Epoch 12/20  Iteration 2077/3560 Training loss: 1.4298 1.7205 sec/batch
Epoch 12/20  Iteration 2078/3560 Training loss: 1.4298 1.7230 sec/batch
Epoch 12/20  Iteration 2079/3560 Training loss: 1.4297 1.7046 sec/batch
Epoch 12/20  Iteration 2080/3560 Training loss: 1.4294 1.7035 sec/batch
Epoch 12/20  Iteration 2081/3560 Training loss: 1.4291 1.7251 sec/batch
Epoch 12/20  Iteration 2082/3560 Training loss: 1.4291 1.6986 sec/batch
Epoch 12/20  Iteration 2083/3560 Training loss: 1.4291 1.7242 sec/batch
Epoch 12/20  Iteration 2084/3560 Training loss: 1.4288 1.7311 sec/batch
Epoch 12/20  Iteration 2085/3560 Training loss: 1.4289 1.7068 sec/batch
Epoch 12/20  Iteration 2086/3560 Training loss: 1.4289 1.7111 sec/batch
Epoch 12/20  Iteration 2087/3560 Training loss: 1.4288 1.7147 sec/batch
Epoch 12/20  Iteration 2088/3560 Training loss: 1.4285 1.7014 sec/batch
Epoch 12/20  Iteration 2089/3560 Training loss: 1.4281 1.6905 sec/batch
Epoch 12/20  Iteration 2090/3560 Training loss: 1.4279 1.7223 sec/batch
Epoch 12/20  Iteration 2091/3560 Training loss: 1.4280 1.7048 sec/batch
Epoch 12/20  Iteration 2092/3560 Training loss: 1.4281 1.6987 sec/batch
Epoch 12/20  Iteration 2093/3560 Training loss: 1.4281 1.7216 sec/batch
Epoch 12/20  Iteration 2094/3560 Training loss: 1.4282 1.7380 sec/batch
Epoch 12/20  Iteration 2095/3560 Training loss: 1.4284 1.7024 sec/batch
Epoch 12/20  Iteration 2096/3560 Training loss: 1.4285 1.7326 sec/batch
Epoch 12/20  Iteration 2097/3560 Training loss: 1.4284 1.7017 sec/batch
Epoch 12/20  Iteration 2098/3560 Training loss: 1.4284 1.7294 sec/batch
Epoch 12/20  Iteration 2099/3560 Training loss: 1.4287 1.7002 sec/batch
Epoch 12/20  Iteration 2100/3560 Training loss: 1.4287 1.7277 sec/batch
Epoch 12/20  Iteration 2101/3560 Training loss: 1.4286 1.7355 sec/batch
Epoch 12/20  Iteration 2102/3560 Training loss: 1.4289 1.7270 sec/batch
Epoch 12/20  Iteration 2103/3560 Training loss: 1.4288 1.6908 sec/batch
Epoch 12/20  Iteration 2104/3560 Training loss: 1.4289 1.7223 sec/batch
Epoch 12/20  Iteration 2105/3560 Training loss: 1.4289 1.6969 sec/batch
Epoch 12/20  Iteration 2106/3560 Training loss: 1.4292 1.6881 sec/batch
Epoch 12/20  Iteration 2107/3560 Training loss: 1.4293 1.8628 sec/batch
Epoch 12/20  Iteration 2108/3560 Training loss: 1.4292 1.7509 sec/batch
Epoch 12/20  Iteration 2109/3560 Training loss: 1.4289 1.7328 sec/batch
Epoch 12/20  Iteration 2110/3560 Training loss: 1.4288 1.6959 sec/batch
Epoch 12/20  Iteration 2111/3560 Training loss: 1.4289 1.7067 sec/batch
Epoch 12/20  Iteration 2112/3560 Training loss: 1.4289 1.7049 sec/batch
Epoch 12/20  Iteration 2113/3560 Training loss: 1.4288 1.7204 sec/batch
Epoch 12/20  Iteration 2114/3560 Training loss: 1.4288 1.7084 sec/batch
Epoch 12/20  Iteration 2115/3560 Training loss: 1.4288 1.7007 sec/batch
Epoch 12/20  Iteration 2116/3560 Training loss: 1.4288 1.9829 sec/batch
Epoch 12/20  Iteration 2117/3560 Training loss: 1.4285 1.7282 sec/batch
Epoch 12/20  Iteration 2118/3560 Training loss: 1.4286 1.7336 sec/batch
Epoch 12/20  Iteration 2119/3560 Training loss: 1.4288 1.6933 sec/batch
Epoch 12/20  Iteration 2120/3560 Training loss: 1.4288 1.7232 sec/batch
Epoch 12/20  Iteration 2121/3560 Training loss: 1.4288 1.7203 sec/batch
Epoch 12/20  Iteration 2122/3560 Training loss: 1.4288 1.7288 sec/batch
Epoch 12/20  Iteration 2123/3560 Training loss: 1.4288 1.7281 sec/batch
Epoch 12/20  Iteration 2124/3560 Training loss: 1.4288 1.7302 sec/batch
Epoch 12/20  Iteration 2125/3560 Training loss: 1.4290 1.7315 sec/batch
Epoch 12/20  Iteration 2126/3560 Training loss: 1.4294 1.7176 sec/batch
Epoch 12/20  Iteration 2127/3560 Training loss: 1.4294 1.7414 sec/batch
Epoch 12/20  Iteration 2128/3560 Training loss: 1.4294 1.6908 sec/batch
Epoch 12/20  Iteration 2129/3560 Training loss: 1.4292 1.7064 sec/batch
Epoch 12/20  Iteration 2130/3560 Training loss: 1.4290 1.7236 sec/batch
Epoch 12/20  Iteration 2131/3560 Training loss: 1.4291 1.7374 sec/batch
Epoch 12/20  Iteration 2132/3560 Training loss: 1.4291 1.7245 sec/batch
Epoch 12/20  Iteration 2133/3560 Training loss: 1.4292 1.7358 sec/batch
Epoch 12/20  Iteration 2134/3560 Training loss: 1.4290 1.7245 sec/batch
Epoch 12/20  Iteration 2135/3560 Training loss: 1.4290 1.7026 sec/batch
Epoch 12/20  Iteration 2136/3560 Training loss: 1.4291 1.7182 sec/batch
Epoch 13/20  Iteration 2137/3560 Training loss: 1.5063 1.6976 sec/batch
Epoch 13/20  Iteration 2138/3560 Training loss: 1.4648 1.7162 sec/batch
Epoch 13/20  Iteration 2139/3560 Training loss: 1.4522 1.6969 sec/batch
Epoch 13/20  Iteration 2140/3560 Training loss: 1.4496 1.7237 sec/batch
Epoch 13/20  Iteration 2141/3560 Training loss: 1.4407 1.7229 sec/batch
Epoch 13/20  Iteration 2142/3560 Training loss: 1.4309 1.7129 sec/batch
Epoch 13/20  Iteration 2143/3560 Training loss: 1.4318 1.7236 sec/batch
Epoch 13/20  Iteration 2144/3560 Training loss: 1.4297 1.6935 sec/batch
Epoch 13/20  Iteration 2145/3560 Training loss: 1.4288 1.7233 sec/batch
Epoch 13/20  Iteration 2146/3560 Training loss: 1.4283 1.7352 sec/batch
Epoch 13/20  Iteration 2147/3560 Training loss: 1.4255 1.7221 sec/batch
Epoch 13/20  Iteration 2148/3560 Training loss: 1.4251 1.7334 sec/batch
Epoch 13/20  Iteration 2149/3560 Training loss: 1.4248 1.7152 sec/batch
Epoch 13/20  Iteration 2150/3560 Training loss: 1.4256 1.7342 sec/batch
Epoch 13/20  Iteration 2151/3560 Training loss: 1.4245 1.7211 sec/batch
Epoch 13/20  Iteration 2152/3560 Training loss: 1.4238 1.7109 sec/batch
Epoch 13/20  Iteration 2153/3560 Training loss: 1.4241 1.7161 sec/batch
Epoch 13/20  Iteration 2154/3560 Training loss: 1.4250 1.6908 sec/batch
Epoch 13/20  Iteration 2155/3560 Training loss: 1.4255 1.7388 sec/batch
Epoch 13/20  Iteration 2156/3560 Training loss: 1.4264 1.7023 sec/batch
Epoch 13/20  Iteration 2157/3560 Training loss: 1.4257 1.7243 sec/batch
Epoch 13/20  Iteration 2158/3560 Training loss: 1.4267 1.7048 sec/batch
Epoch 13/20  Iteration 2159/3560 Training loss: 1.4256 1.7102 sec/batch
Epoch 13/20  Iteration 2160/3560 Training loss: 1.4255 1.7339 sec/batch
Epoch 13/20  Iteration 2161/3560 Training loss: 1.4255 1.6918 sec/batch
Epoch 13/20  Iteration 2162/3560 Training loss: 1.4241 1.7098 sec/batch
Epoch 13/20  Iteration 2163/3560 Training loss: 1.4234 1.7260 sec/batch
Epoch 13/20  Iteration 2164/3560 Training loss: 1.4242 1.7252 sec/batch
Epoch 13/20  Iteration 2165/3560 Training loss: 1.4243 1.6968 sec/batch
Epoch 13/20  Iteration 2166/3560 Training loss: 1.4247 1.7287 sec/batch
Epoch 13/20  Iteration 2167/3560 Training loss: 1.4244 1.7323 sec/batch
Epoch 13/20  Iteration 2168/3560 Training loss: 1.4237 1.7430 sec/batch
Epoch 13/20  Iteration 2169/3560 Training loss: 1.4241 1.7078 sec/batch
Epoch 13/20  Iteration 2170/3560 Training loss: 1.4244 1.6941 sec/batch
Epoch 13/20  Iteration 2171/3560 Training loss: 1.4240 1.7276 sec/batch
Epoch 13/20  Iteration 2172/3560 Training loss: 1.4238 1.7269 sec/batch
Epoch 13/20  Iteration 2173/3560 Training loss: 1.4229 1.6906 sec/batch
Epoch 13/20  Iteration 2174/3560 Training loss: 1.4217 1.7176 sec/batch
Epoch 13/20  Iteration 2175/3560 Training loss: 1.4203 1.7290 sec/batch
Epoch 13/20  Iteration 2176/3560 Training loss: 1.4197 1.7306 sec/batch
Epoch 13/20  Iteration 2177/3560 Training loss: 1.4192 1.7974 sec/batch
Epoch 13/20  Iteration 2178/3560 Training loss: 1.4198 1.7632 sec/batch
Epoch 13/20  Iteration 2179/3560 Training loss: 1.4193 1.7370 sec/batch
Epoch 13/20  Iteration 2180/3560 Training loss: 1.4188 1.7295 sec/batch
Epoch 13/20  Iteration 2181/3560 Training loss: 1.4190 1.6965 sec/batch
Epoch 13/20  Iteration 2182/3560 Training loss: 1.4182 1.7346 sec/batch
Epoch 13/20  Iteration 2183/3560 Training loss: 1.4179 1.7301 sec/batch
Epoch 13/20  Iteration 2184/3560 Training loss: 1.4175 1.6875 sec/batch
Epoch 13/20  Iteration 2185/3560 Training loss: 1.4172 1.7442 sec/batch
Epoch 13/20  Iteration 2186/3560 Training loss: 1.4175 1.6868 sec/batch
Epoch 13/20  Iteration 2187/3560 Training loss: 1.4170 1.7330 sec/batch
Epoch 13/20  Iteration 2188/3560 Training loss: 1.4174 2.0947 sec/batch
Epoch 13/20  Iteration 2189/3560 Training loss: 1.4173 1.7468 sec/batch
Epoch 13/20  Iteration 2190/3560 Training loss: 1.4174 1.7414 sec/batch
Epoch 13/20  Iteration 2191/3560 Training loss: 1.4173 1.7241 sec/batch
Epoch 13/20  Iteration 2192/3560 Training loss: 1.4172 1.7371 sec/batch
Epoch 13/20  Iteration 2193/3560 Training loss: 1.4176 1.7271 sec/batch
Epoch 13/20  Iteration 2194/3560 Training loss: 1.4172 1.7340 sec/batch
Epoch 13/20  Iteration 2195/3560 Training loss: 1.4166 1.7468 sec/batch
Epoch 13/20  Iteration 2196/3560 Training loss: 1.4173 1.7188 sec/batch
Epoch 13/20  Iteration 2197/3560 Training loss: 1.4173 1.7061 sec/batch
Epoch 13/20  Iteration 2198/3560 Training loss: 1.4182 1.6994 sec/batch
Epoch 13/20  Iteration 2199/3560 Training loss: 1.4184 1.7262 sec/batch
Epoch 13/20  Iteration 2200/3560 Training loss: 1.4186 1.7326 sec/batch
Epoch 13/20  Iteration 2201/3560 Training loss: 1.4185 1.7055 sec/batch
Epoch 13/20  Iteration 2202/3560 Training loss: 1.4187 1.7374 sec/batch
Epoch 13/20  Iteration 2203/3560 Training loss: 1.4189 1.7317 sec/batch
Epoch 13/20  Iteration 2204/3560 Training loss: 1.4186 1.7174 sec/batch
Epoch 13/20  Iteration 2205/3560 Training loss: 1.4187 1.7045 sec/batch
Epoch 13/20  Iteration 2206/3560 Training loss: 1.4185 1.7234 sec/batch
Epoch 13/20  Iteration 2207/3560 Training loss: 1.4190 1.7721 sec/batch
Epoch 13/20  Iteration 2208/3560 Training loss: 1.4194 1.7037 sec/batch
Epoch 13/20  Iteration 2209/3560 Training loss: 1.4198 1.6962 sec/batch
Epoch 13/20  Iteration 2210/3560 Training loss: 1.4194 1.7152 sec/batch
Epoch 13/20  Iteration 2211/3560 Training loss: 1.4192 1.7099 sec/batch
Epoch 13/20  Iteration 2212/3560 Training loss: 1.4193 1.7319 sec/batch
Epoch 13/20  Iteration 2213/3560 Training loss: 1.4192 1.7202 sec/batch
Epoch 13/20  Iteration 2214/3560 Training loss: 1.4190 1.7251 sec/batch
Epoch 13/20  Iteration 2215/3560 Training loss: 1.4185 1.6979 sec/batch
Epoch 13/20  Iteration 2216/3560 Training loss: 1.4184 1.7248 sec/batch
Epoch 13/20  Iteration 2217/3560 Training loss: 1.4180 1.7053 sec/batch
Epoch 13/20  Iteration 2218/3560 Training loss: 1.4178 1.7324 sec/batch
Epoch 13/20  Iteration 2219/3560 Training loss: 1.4173 1.7463 sec/batch
Epoch 13/20  Iteration 2220/3560 Training loss: 1.4172 1.7230 sec/batch
Epoch 13/20  Iteration 2221/3560 Training loss: 1.4170 1.7272 sec/batch
Epoch 13/20  Iteration 2222/3560 Training loss: 1.4168 1.6981 sec/batch
Epoch 13/20  Iteration 2223/3560 Training loss: 1.4166 1.7208 sec/batch
Epoch 13/20  Iteration 2224/3560 Training loss: 1.4164 1.7032 sec/batch
Epoch 13/20  Iteration 2225/3560 Training loss: 1.4161 1.7268 sec/batch
Epoch 13/20  Iteration 2226/3560 Training loss: 1.4161 1.7280 sec/batch
Epoch 13/20  Iteration 2227/3560 Training loss: 1.4158 1.7324 sec/batch
Epoch 13/20  Iteration 2228/3560 Training loss: 1.4158 1.6935 sec/batch
Epoch 13/20  Iteration 2229/3560 Training loss: 1.4155 1.7376 sec/batch
Epoch 13/20  Iteration 2230/3560 Training loss: 1.4152 1.7083 sec/batch
Epoch 13/20  Iteration 2231/3560 Training loss: 1.4148 1.7238 sec/batch
Epoch 13/20  Iteration 2232/3560 Training loss: 1.4149 1.7238 sec/batch
Epoch 13/20  Iteration 2233/3560 Training loss: 1.4149 1.7043 sec/batch
Epoch 13/20  Iteration 2234/3560 Training loss: 1.4145 1.7072 sec/batch
Epoch 13/20  Iteration 2235/3560 Training loss: 1.4141 1.7472 sec/batch
Epoch 13/20  Iteration 2236/3560 Training loss: 1.4136 1.7254 sec/batch
Epoch 13/20  Iteration 2237/3560 Training loss: 1.4137 1.7511 sec/batch
Epoch 13/20  Iteration 2238/3560 Training loss: 1.4137 1.7288 sec/batch
Epoch 13/20  Iteration 2239/3560 Training loss: 1.4135 1.7277 sec/batch
Epoch 13/20  Iteration 2240/3560 Training loss: 1.4134 1.7232 sec/batch
Epoch 13/20  Iteration 2241/3560 Training loss: 1.4132 1.7133 sec/batch
Epoch 13/20  Iteration 2242/3560 Training loss: 1.4131 1.7268 sec/batch
Epoch 13/20  Iteration 2243/3560 Training loss: 1.4131 1.7400 sec/batch
Epoch 13/20  Iteration 2244/3560 Training loss: 1.4130 1.7219 sec/batch
Epoch 13/20  Iteration 2245/3560 Training loss: 1.4130 1.7713 sec/batch
Epoch 13/20  Iteration 2246/3560 Training loss: 1.4130 1.8187 sec/batch
Epoch 13/20  Iteration 2247/3560 Training loss: 1.4129 1.7291 sec/batch
Epoch 13/20  Iteration 2248/3560 Training loss: 1.4129 1.7248 sec/batch
Epoch 13/20  Iteration 2249/3560 Training loss: 1.4128 1.7316 sec/batch
Epoch 13/20  Iteration 2250/3560 Training loss: 1.4126 1.6979 sec/batch
Epoch 13/20  Iteration 2251/3560 Training loss: 1.4124 1.7053 sec/batch
Epoch 13/20  Iteration 2252/3560 Training loss: 1.4120 1.7046 sec/batch
Epoch 13/20  Iteration 2253/3560 Training loss: 1.4120 1.7280 sec/batch
Epoch 13/20  Iteration 2254/3560 Training loss: 1.4120 1.7084 sec/batch
Epoch 13/20  Iteration 2255/3560 Training loss: 1.4119 1.7122 sec/batch
Epoch 13/20  Iteration 2256/3560 Training loss: 1.4119 1.7133 sec/batch
Epoch 13/20  Iteration 2257/3560 Training loss: 1.4118 1.7065 sec/batch
Epoch 13/20  Iteration 2258/3560 Training loss: 1.4115 1.7068 sec/batch
Epoch 13/20  Iteration 2259/3560 Training loss: 1.4112 1.7034 sec/batch
Epoch 13/20  Iteration 2260/3560 Training loss: 1.4112 1.7260 sec/batch
Epoch 13/20  Iteration 2261/3560 Training loss: 1.4111 1.7186 sec/batch
Epoch 13/20  Iteration 2262/3560 Training loss: 1.4108 1.7100 sec/batch
Epoch 13/20  Iteration 2263/3560 Training loss: 1.4109 1.7073 sec/batch
Epoch 13/20  Iteration 2264/3560 Training loss: 1.4108 1.7109 sec/batch
Epoch 13/20  Iteration 2265/3560 Training loss: 1.4107 1.6918 sec/batch
Epoch 13/20  Iteration 2266/3560 Training loss: 1.4104 1.7194 sec/batch
Epoch 13/20  Iteration 2267/3560 Training loss: 1.4101 1.7319 sec/batch
Epoch 13/20  Iteration 2268/3560 Training loss: 1.4099 1.7219 sec/batch
Epoch 13/20  Iteration 2269/3560 Training loss: 1.4100 1.7251 sec/batch
Epoch 13/20  Iteration 2270/3560 Training loss: 1.4100 1.7122 sec/batch
Epoch 13/20  Iteration 2271/3560 Training loss: 1.4099 1.6961 sec/batch
Epoch 13/20  Iteration 2272/3560 Training loss: 1.4098 1.7354 sec/batch
Epoch 13/20  Iteration 2273/3560 Training loss: 1.4100 1.7245 sec/batch
Epoch 13/20  Iteration 2274/3560 Training loss: 1.4101 1.7239 sec/batch
Epoch 13/20  Iteration 2275/3560 Training loss: 1.4101 1.7194 sec/batch
Epoch 13/20  Iteration 2276/3560 Training loss: 1.4101 1.7278 sec/batch
Epoch 13/20  Iteration 2277/3560 Training loss: 1.4105 1.6952 sec/batch
Epoch 13/20  Iteration 2278/3560 Training loss: 1.4104 1.7218 sec/batch
Epoch 13/20  Iteration 2279/3560 Training loss: 1.4104 1.7223 sec/batch
Epoch 13/20  Iteration 2280/3560 Training loss: 1.4106 1.7284 sec/batch
Epoch 13/20  Iteration 2281/3560 Training loss: 1.4104 1.6979 sec/batch
Epoch 13/20  Iteration 2282/3560 Training loss: 1.4106 1.7317 sec/batch
Epoch 13/20  Iteration 2283/3560 Training loss: 1.4106 1.7075 sec/batch
Epoch 13/20  Iteration 2284/3560 Training loss: 1.4108 1.7222 sec/batch
Epoch 13/20  Iteration 2285/3560 Training loss: 1.4109 1.7026 sec/batch
Epoch 13/20  Iteration 2286/3560 Training loss: 1.4109 1.7373 sec/batch
Epoch 13/20  Iteration 2287/3560 Training loss: 1.4106 1.7290 sec/batch
Epoch 13/20  Iteration 2288/3560 Training loss: 1.4105 1.6845 sec/batch
Epoch 13/20  Iteration 2289/3560 Training loss: 1.4106 1.7380 sec/batch
Epoch 13/20  Iteration 2290/3560 Training loss: 1.4107 1.7317 sec/batch
Epoch 13/20  Iteration 2291/3560 Training loss: 1.4106 1.7258 sec/batch
Epoch 13/20  Iteration 2292/3560 Training loss: 1.4106 1.7252 sec/batch
Epoch 13/20  Iteration 2293/3560 Training loss: 1.4106 1.7161 sec/batch
Epoch 13/20  Iteration 2294/3560 Training loss: 1.4106 1.7206 sec/batch
Epoch 13/20  Iteration 2295/3560 Training loss: 1.4103 1.6963 sec/batch
Epoch 13/20  Iteration 2296/3560 Training loss: 1.4105 1.7124 sec/batch
Epoch 13/20  Iteration 2297/3560 Training loss: 1.4108 1.6976 sec/batch
Epoch 13/20  Iteration 2298/3560 Training loss: 1.4108 1.7174 sec/batch
Epoch 13/20  Iteration 2299/3560 Training loss: 1.4109 1.7157 sec/batch
Epoch 13/20  Iteration 2300/3560 Training loss: 1.4108 1.7158 sec/batch
Epoch 13/20  Iteration 2301/3560 Training loss: 1.4108 1.7385 sec/batch
Epoch 13/20  Iteration 2302/3560 Training loss: 1.4108 1.7146 sec/batch
Epoch 13/20  Iteration 2303/3560 Training loss: 1.4110 1.7181 sec/batch
Epoch 13/20  Iteration 2304/3560 Training loss: 1.4114 1.7105 sec/batch
Epoch 13/20  Iteration 2305/3560 Training loss: 1.4115 1.7310 sec/batch
Epoch 13/20  Iteration 2306/3560 Training loss: 1.4114 1.7579 sec/batch
Epoch 13/20  Iteration 2307/3560 Training loss: 1.4114 1.6986 sec/batch
Epoch 13/20  Iteration 2308/3560 Training loss: 1.4112 1.7208 sec/batch
Epoch 13/20  Iteration 2309/3560 Training loss: 1.4113 1.6970 sec/batch
Epoch 13/20  Iteration 2310/3560 Training loss: 1.4113 1.7224 sec/batch
Epoch 13/20  Iteration 2311/3560 Training loss: 1.4114 1.7091 sec/batch
Epoch 13/20  Iteration 2312/3560 Training loss: 1.4112 1.7065 sec/batch
Epoch 13/20  Iteration 2313/3560 Training loss: 1.4111 1.6986 sec/batch
Epoch 13/20  Iteration 2314/3560 Training loss: 1.4112 1.8583 sec/batch
Epoch 14/20  Iteration 2315/3560 Training loss: 1.4884 1.7889 sec/batch
Epoch 14/20  Iteration 2316/3560 Training loss: 1.4542 1.7439 sec/batch
Epoch 14/20  Iteration 2317/3560 Training loss: 1.4369 1.7356 sec/batch
Epoch 14/20  Iteration 2318/3560 Training loss: 1.4327 1.7072 sec/batch
Epoch 14/20  Iteration 2319/3560 Training loss: 1.4241 1.7044 sec/batch
Epoch 14/20  Iteration 2320/3560 Training loss: 1.4143 1.7283 sec/batch
Epoch 14/20  Iteration 2321/3560 Training loss: 1.4142 1.6982 sec/batch
Epoch 14/20  Iteration 2322/3560 Training loss: 1.4122 1.7032 sec/batch
Epoch 14/20  Iteration 2323/3560 Training loss: 1.4133 1.7490 sec/batch
Epoch 14/20  Iteration 2324/3560 Training loss: 1.4133 1.7314 sec/batch
Epoch 14/20  Iteration 2325/3560 Training loss: 1.4104 1.6963 sec/batch
Epoch 14/20  Iteration 2326/3560 Training loss: 1.4097 1.7214 sec/batch
Epoch 14/20  Iteration 2327/3560 Training loss: 1.4101 1.7067 sec/batch
Epoch 14/20  Iteration 2328/3560 Training loss: 1.4104 1.7325 sec/batch
Epoch 14/20  Iteration 2329/3560 Training loss: 1.4084 1.7167 sec/batch
Epoch 14/20  Iteration 2330/3560 Training loss: 1.4068 1.7673 sec/batch
Epoch 14/20  Iteration 2331/3560 Training loss: 1.4071 1.7372 sec/batch
Epoch 14/20  Iteration 2332/3560 Training loss: 1.4077 1.7302 sec/batch
Epoch 14/20  Iteration 2333/3560 Training loss: 1.4080 1.6868 sec/batch
Epoch 14/20  Iteration 2334/3560 Training loss: 1.4085 1.7315 sec/batch
Epoch 14/20  Iteration 2335/3560 Training loss: 1.4080 1.7210 sec/batch
Epoch 14/20  Iteration 2336/3560 Training loss: 1.4086 1.7281 sec/batch
Epoch 14/20  Iteration 2337/3560 Training loss: 1.4080 1.7236 sec/batch
Epoch 14/20  Iteration 2338/3560 Training loss: 1.4079 1.6980 sec/batch
Epoch 14/20  Iteration 2339/3560 Training loss: 1.4072 1.6917 sec/batch
Epoch 14/20  Iteration 2340/3560 Training loss: 1.4058 1.7269 sec/batch
Epoch 14/20  Iteration 2341/3560 Training loss: 1.4049 1.7354 sec/batch
Epoch 14/20  Iteration 2342/3560 Training loss: 1.4059 1.7193 sec/batch
Epoch 14/20  Iteration 2343/3560 Training loss: 1.4063 1.7308 sec/batch
Epoch 14/20  Iteration 2344/3560 Training loss: 1.4066 1.7180 sec/batch
Epoch 14/20  Iteration 2345/3560 Training loss: 1.4062 1.7055 sec/batch
Epoch 14/20  Iteration 2346/3560 Training loss: 1.4054 1.7051 sec/batch
Epoch 14/20  Iteration 2347/3560 Training loss: 1.4054 1.7245 sec/batch
Epoch 14/20  Iteration 2348/3560 Training loss: 1.4054 1.7403 sec/batch
Epoch 14/20  Iteration 2349/3560 Training loss: 1.4055 1.7302 sec/batch
Epoch 14/20  Iteration 2350/3560 Training loss: 1.4051 1.7201 sec/batch
Epoch 14/20  Iteration 2351/3560 Training loss: 1.4044 1.7146 sec/batch
Epoch 14/20  Iteration 2352/3560 Training loss: 1.4034 1.7424 sec/batch
Epoch 14/20  Iteration 2353/3560 Training loss: 1.4024 1.7563 sec/batch
Epoch 14/20  Iteration 2354/3560 Training loss: 1.4019 1.7304 sec/batch
Epoch 14/20  Iteration 2355/3560 Training loss: 1.4014 1.7423 sec/batch
Epoch 14/20  Iteration 2356/3560 Training loss: 1.4021 1.7362 sec/batch
Epoch 14/20  Iteration 2357/3560 Training loss: 1.4017 1.7228 sec/batch
Epoch 14/20  Iteration 2358/3560 Training loss: 1.4008 1.7058 sec/batch
Epoch 14/20  Iteration 2359/3560 Training loss: 1.4010 1.6961 sec/batch
Epoch 14/20  Iteration 2360/3560 Training loss: 1.4001 1.8038 sec/batch
Epoch 14/20  Iteration 2361/3560 Training loss: 1.4002 2.0953 sec/batch
Epoch 14/20  Iteration 2362/3560 Training loss: 1.3998 1.7541 sec/batch
Epoch 14/20  Iteration 2363/3560 Training loss: 1.3998 1.7268 sec/batch
Epoch 14/20  Iteration 2364/3560 Training loss: 1.4000 1.7383 sec/batch
Epoch 14/20  Iteration 2365/3560 Training loss: 1.3994 1.7130 sec/batch
Epoch 14/20  Iteration 2366/3560 Training loss: 1.4003 1.7142 sec/batch
Epoch 14/20  Iteration 2367/3560 Training loss: 1.4003 1.8266 sec/batch
Epoch 14/20  Iteration 2368/3560 Training loss: 1.4004 1.8483 sec/batch
Epoch 14/20  Iteration 2369/3560 Training loss: 1.4001 1.8966 sec/batch
Epoch 14/20  Iteration 2370/3560 Training loss: 1.4004 1.7610 sec/batch
Epoch 14/20  Iteration 2371/3560 Training loss: 1.4007 1.7918 sec/batch
Epoch 14/20  Iteration 2372/3560 Training loss: 1.4003 1.8049 sec/batch
Epoch 14/20  Iteration 2373/3560 Training loss: 1.3998 1.8884 sec/batch
Epoch 14/20  Iteration 2374/3560 Training loss: 1.4005 1.9098 sec/batch
Epoch 14/20  Iteration 2375/3560 Training loss: 1.4004 1.7247 sec/batch
Epoch 14/20  Iteration 2376/3560 Training loss: 1.4013 1.7303 sec/batch
Epoch 14/20  Iteration 2377/3560 Training loss: 1.4017 1.7290 sec/batch
Epoch 14/20  Iteration 2378/3560 Training loss: 1.4020 1.7712 sec/batch
Epoch 14/20  Iteration 2379/3560 Training loss: 1.4019 1.7005 sec/batch
Epoch 14/20  Iteration 2380/3560 Training loss: 1.4019 1.7576 sec/batch
Epoch 14/20  Iteration 2381/3560 Training loss: 1.4023 1.7265 sec/batch
Epoch 14/20  Iteration 2382/3560 Training loss: 1.4019 1.7286 sec/batch
Epoch 14/20  Iteration 2383/3560 Training loss: 1.4020 1.7463 sec/batch
Epoch 14/20  Iteration 2384/3560 Training loss: 1.4020 1.8775 sec/batch
Epoch 14/20  Iteration 2385/3560 Training loss: 1.4025 1.8020 sec/batch
Epoch 14/20  Iteration 2386/3560 Training loss: 1.4028 1.7451 sec/batch
Epoch 14/20  Iteration 2387/3560 Training loss: 1.4033 1.7309 sec/batch
Epoch 14/20  Iteration 2388/3560 Training loss: 1.4030 1.7292 sec/batch
Epoch 14/20  Iteration 2389/3560 Training loss: 1.4027 1.7085 sec/batch
Epoch 14/20  Iteration 2390/3560 Training loss: 1.4029 1.7228 sec/batch
Epoch 14/20  Iteration 2391/3560 Training loss: 1.4028 1.6973 sec/batch
Epoch 14/20  Iteration 2392/3560 Training loss: 1.4027 1.7223 sec/batch
Epoch 14/20  Iteration 2393/3560 Training loss: 1.4022 1.7007 sec/batch
Epoch 14/20  Iteration 2394/3560 Training loss: 1.4021 1.7011 sec/batch
Epoch 14/20  Iteration 2395/3560 Training loss: 1.4016 1.7236 sec/batch
Epoch 14/20  Iteration 2396/3560 Training loss: 1.4016 1.7260 sec/batch
Epoch 14/20  Iteration 2397/3560 Training loss: 1.4012 1.7047 sec/batch
Epoch 14/20  Iteration 2398/3560 Training loss: 1.4012 1.7350 sec/batch
Epoch 14/20  Iteration 2399/3560 Training loss: 1.4010 1.7275 sec/batch
Epoch 14/20  Iteration 2400/3560 Training loss: 1.4008 1.7020 sec/batch
Epoch 14/20  Iteration 2401/3560 Training loss: 1.4005 1.7212 sec/batch
Epoch 14/20  Iteration 2402/3560 Training loss: 1.4002 1.7052 sec/batch
Epoch 14/20  Iteration 2403/3560 Training loss: 1.3998 1.6986 sec/batch
Epoch 14/20  Iteration 2404/3560 Training loss: 1.3999 1.6920 sec/batch
Epoch 14/20  Iteration 2405/3560 Training loss: 1.3997 1.7050 sec/batch
Epoch 14/20  Iteration 2406/3560 Training loss: 1.3995 1.7253 sec/batch
Epoch 14/20  Iteration 2407/3560 Training loss: 1.3992 1.7270 sec/batch
Epoch 14/20  Iteration 2408/3560 Training loss: 1.3989 1.7450 sec/batch
Epoch 14/20  Iteration 2409/3560 Training loss: 1.3986 1.7337 sec/batch
Epoch 14/20  Iteration 2410/3560 Training loss: 1.3987 1.7472 sec/batch
Epoch 14/20  Iteration 2411/3560 Training loss: 1.3986 1.7292 sec/batch
Epoch 14/20  Iteration 2412/3560 Training loss: 1.3982 1.7271 sec/batch
Epoch 14/20  Iteration 2413/3560 Training loss: 1.3978 1.7046 sec/batch
Epoch 14/20  Iteration 2414/3560 Training loss: 1.3973 1.7272 sec/batch
Epoch 14/20  Iteration 2415/3560 Training loss: 1.3974 1.7404 sec/batch
Epoch 14/20  Iteration 2416/3560 Training loss: 1.3973 1.6881 sec/batch
Epoch 14/20  Iteration 2417/3560 Training loss: 1.3973 1.7111 sec/batch
Epoch 14/20  Iteration 2418/3560 Training loss: 1.3971 1.7257 sec/batch
...
Epoch 14/20  Iteration 2492/3560 Training loss: 1.3957 1.7290 sec/batch
Epoch 15/20  Iteration 2493/3560 Training loss: 1.4707 1.7436 sec/batch
...
Epoch 15/20  Iteration 2670/3560 Training loss: 1.3821 1.7350 sec/batch
Epoch 16/20  Iteration 2671/3560 Training loss: 1.4685 1.7308 sec/batch
...
Epoch 16/20  Iteration 2848/3560 Training loss: 1.3694 1.7198 sec/batch
Epoch 17/20  Iteration 2849/3560 Training loss: 1.4598 1.6993 sec/batch
...
Epoch 17/20  Iteration 3026/3560 Training loss: 1.3580 1.7266 sec/batch
Epoch 18/20  Iteration 3027/3560 Training loss: 1.4457 1.7249 sec/batch
...
Epoch 18/20  Iteration 3204/3560 Training loss: 1.3484 1.8878 sec/batch
Epoch 19/20  Iteration 3205/3560 Training loss: 1.4341 1.9365 sec/batch
...
Epoch 19/20  Iteration 3308/3560 Training loss: 1.3399 1.9286 sec/batch
Epoch 19/20  Iteration 3309/3560 Training loss: 1.3398 1.8451 sec/batch
Epoch 19/20  Iteration 3310/3560 Training loss: 1.3398 1.8540 sec/batch
Epoch 19/20  Iteration 3311/3560 Training loss: 1.3397 1.8457 sec/batch
Epoch 19/20  Iteration 3312/3560 Training loss: 1.3397 1.9069 sec/batch
Epoch 19/20  Iteration 3313/3560 Training loss: 1.3395 1.8462 sec/batch
Epoch 19/20  Iteration 3314/3560 Training loss: 1.3396 1.8403 sec/batch
Epoch 19/20  Iteration 3315/3560 Training loss: 1.3394 1.8987 sec/batch
Epoch 19/20  Iteration 3316/3560 Training loss: 1.3394 1.9246 sec/batch
Epoch 19/20  Iteration 3317/3560 Training loss: 1.3393 1.8529 sec/batch
Epoch 19/20  Iteration 3318/3560 Training loss: 1.3391 1.8504 sec/batch
Epoch 19/20  Iteration 3319/3560 Training loss: 1.3389 1.8227 sec/batch
Epoch 19/20  Iteration 3320/3560 Training loss: 1.3386 1.8909 sec/batch
Epoch 19/20  Iteration 3321/3560 Training loss: 1.3386 1.8336 sec/batch
Epoch 19/20  Iteration 3322/3560 Training loss: 1.3387 1.8627 sec/batch
Epoch 19/20  Iteration 3323/3560 Training loss: 1.3386 1.9116 sec/batch
Epoch 19/20  Iteration 3324/3560 Training loss: 1.3386 1.9396 sec/batch
Epoch 19/20  Iteration 3325/3560 Training loss: 1.3385 2.0171 sec/batch
Epoch 19/20  Iteration 3326/3560 Training loss: 1.3382 2.0692 sec/batch
Epoch 19/20  Iteration 3327/3560 Training loss: 1.3378 2.5221 sec/batch
Epoch 19/20  Iteration 3328/3560 Training loss: 1.3378 2.0791 sec/batch
Epoch 19/20  Iteration 3329/3560 Training loss: 1.3378 1.8812 sec/batch
Epoch 19/20  Iteration 3330/3560 Training loss: 1.3374 1.8610 sec/batch
Epoch 19/20  Iteration 3331/3560 Training loss: 1.3376 2.0910 sec/batch
Epoch 19/20  Iteration 3332/3560 Training loss: 1.3377 1.8901 sec/batch
Epoch 19/20  Iteration 3333/3560 Training loss: 1.3375 1.8402 sec/batch
Epoch 19/20  Iteration 3334/3560 Training loss: 1.3372 1.8906 sec/batch
Epoch 19/20  Iteration 3335/3560 Training loss: 1.3368 1.9440 sec/batch
Epoch 19/20  Iteration 3336/3560 Training loss: 1.3367 1.8705 sec/batch
Epoch 19/20  Iteration 3337/3560 Training loss: 1.3368 1.9059 sec/batch
Epoch 19/20  Iteration 3338/3560 Training loss: 1.3369 1.8655 sec/batch
Epoch 19/20  Iteration 3339/3560 Training loss: 1.3369 1.9652 sec/batch
Epoch 19/20  Iteration 3340/3560 Training loss: 1.3369 1.8687 sec/batch
Epoch 19/20  Iteration 3341/3560 Training loss: 1.3371 1.9098 sec/batch
Epoch 19/20  Iteration 3342/3560 Training loss: 1.3372 1.8713 sec/batch
Epoch 19/20  Iteration 3343/3560 Training loss: 1.3372 1.9816 sec/batch
Epoch 19/20  Iteration 3344/3560 Training loss: 1.3373 1.8980 sec/batch
Epoch 19/20  Iteration 3345/3560 Training loss: 1.3376 1.8955 sec/batch
Epoch 19/20  Iteration 3346/3560 Training loss: 1.3376 1.9076 sec/batch
Epoch 19/20  Iteration 3347/3560 Training loss: 1.3375 1.9673 sec/batch
Epoch 19/20  Iteration 3348/3560 Training loss: 1.3377 1.9261 sec/batch
Epoch 19/20  Iteration 3349/3560 Training loss: 1.3376 1.9092 sec/batch
Epoch 19/20  Iteration 3350/3560 Training loss: 1.3378 1.9088 sec/batch
Epoch 19/20  Iteration 3351/3560 Training loss: 1.3378 1.9226 sec/batch
Epoch 19/20  Iteration 3352/3560 Training loss: 1.3381 1.9318 sec/batch
Epoch 19/20  Iteration 3353/3560 Training loss: 1.3383 1.8876 sec/batch
Epoch 19/20  Iteration 3354/3560 Training loss: 1.3382 1.9252 sec/batch
Epoch 19/20  Iteration 3355/3560 Training loss: 1.3380 1.9173 sec/batch
Epoch 19/20  Iteration 3356/3560 Training loss: 1.3379 1.9540 sec/batch
Epoch 19/20  Iteration 3357/3560 Training loss: 1.3380 1.9159 sec/batch
Epoch 19/20  Iteration 3358/3560 Training loss: 1.3380 1.9044 sec/batch
Epoch 19/20  Iteration 3359/3560 Training loss: 1.3381 1.8967 sec/batch
Epoch 19/20  Iteration 3360/3560 Training loss: 1.3380 2.0604 sec/batch
Epoch 19/20  Iteration 3361/3560 Training loss: 1.3381 2.4045 sec/batch
Epoch 19/20  Iteration 3362/3560 Training loss: 1.3381 2.6136 sec/batch
Epoch 19/20  Iteration 3363/3560 Training loss: 1.3379 2.1294 sec/batch
Epoch 19/20  Iteration 3364/3560 Training loss: 1.3381 2.4012 sec/batch
Epoch 19/20  Iteration 3365/3560 Training loss: 1.3383 2.1852 sec/batch
Epoch 19/20  Iteration 3366/3560 Training loss: 1.3383 1.8984 sec/batch
Epoch 19/20  Iteration 3367/3560 Training loss: 1.3384 1.8972 sec/batch
Epoch 19/20  Iteration 3368/3560 Training loss: 1.3383 1.9820 sec/batch
Epoch 19/20  Iteration 3369/3560 Training loss: 1.3384 1.9871 sec/batch
Epoch 19/20  Iteration 3370/3560 Training loss: 1.3385 1.9148 sec/batch
Epoch 19/20  Iteration 3371/3560 Training loss: 1.3387 1.9049 sec/batch
Epoch 19/20  Iteration 3372/3560 Training loss: 1.3391 1.9264 sec/batch
Epoch 19/20  Iteration 3373/3560 Training loss: 1.3391 2.0043 sec/batch
Epoch 19/20  Iteration 3374/3560 Training loss: 1.3391 1.9048 sec/batch
Epoch 19/20  Iteration 3375/3560 Training loss: 1.3391 1.9192 sec/batch
Epoch 19/20  Iteration 3376/3560 Training loss: 1.3389 1.9241 sec/batch
Epoch 19/20  Iteration 3377/3560 Training loss: 1.3391 1.9438 sec/batch
Epoch 19/20  Iteration 3378/3560 Training loss: 1.3391 1.9136 sec/batch
Epoch 19/20  Iteration 3379/3560 Training loss: 1.3392 1.9153 sec/batch
Epoch 19/20  Iteration 3380/3560 Training loss: 1.3390 1.8801 sec/batch
Epoch 19/20  Iteration 3381/3560 Training loss: 1.3389 2.0396 sec/batch
Epoch 19/20  Iteration 3382/3560 Training loss: 1.3390 1.9240 sec/batch
Epoch 20/20  Iteration 3383/3560 Training loss: 1.4257 1.9817 sec/batch
Epoch 20/20  Iteration 3384/3560 Training loss: 1.3907 1.9924 sec/batch
Epoch 20/20  Iteration 3385/3560 Training loss: 1.3723 1.9528 sec/batch
Epoch 20/20  Iteration 3386/3560 Training loss: 1.3705 1.9109 sec/batch
Epoch 20/20  Iteration 3387/3560 Training loss: 1.3612 1.9162 sec/batch
Epoch 20/20  Iteration 3388/3560 Training loss: 1.3509 1.8573 sec/batch
Epoch 20/20  Iteration 3389/3560 Training loss: 1.3490 1.9225 sec/batch
Epoch 20/20  Iteration 3390/3560 Training loss: 1.3459 1.9093 sec/batch
Epoch 20/20  Iteration 3391/3560 Training loss: 1.3453 1.9163 sec/batch
Epoch 20/20  Iteration 3392/3560 Training loss: 1.3438 1.9260 sec/batch
Epoch 20/20  Iteration 3393/3560 Training loss: 1.3412 1.9012 sec/batch
Epoch 20/20  Iteration 3394/3560 Training loss: 1.3407 1.9056 sec/batch
Epoch 20/20  Iteration 3395/3560 Training loss: 1.3405 1.9270 sec/batch
Epoch 20/20  Iteration 3396/3560 Training loss: 1.3408 1.9296 sec/batch
Epoch 20/20  Iteration 3397/3560 Training loss: 1.3395 1.8480 sec/batch
Epoch 20/20  Iteration 3398/3560 Training loss: 1.3376 2.0159 sec/batch
Epoch 20/20  Iteration 3399/3560 Training loss: 1.3383 1.9721 sec/batch
Epoch 20/20  Iteration 3400/3560 Training loss: 1.3399 1.9615 sec/batch
Epoch 20/20  Iteration 3401/3560 Training loss: 1.3405 1.9629 sec/batch
Epoch 20/20  Iteration 3402/3560 Training loss: 1.3408 2.1534 sec/batch
Epoch 20/20  Iteration 3403/3560 Training loss: 1.3402 2.4548 sec/batch
Epoch 20/20  Iteration 3404/3560 Training loss: 1.3407 2.0002 sec/batch
Epoch 20/20  Iteration 3405/3560 Training loss: 1.3400 1.9362 sec/batch
Epoch 20/20  Iteration 3406/3560 Training loss: 1.3400 1.9367 sec/batch
Epoch 20/20  Iteration 3407/3560 Training loss: 1.3400 1.9283 sec/batch
Epoch 20/20  Iteration 3408/3560 Training loss: 1.3383 1.9040 sec/batch
Epoch 20/20  Iteration 3409/3560 Training loss: 1.3373 1.8826 sec/batch
Epoch 20/20  Iteration 3410/3560 Training loss: 1.3375 2.0043 sec/batch
Epoch 20/20  Iteration 3411/3560 Training loss: 1.3378 2.0064 sec/batch
Epoch 20/20  Iteration 3412/3560 Training loss: 1.3382 1.9077 sec/batch
Epoch 20/20  Iteration 3413/3560 Training loss: 1.3377 1.8944 sec/batch
Epoch 20/20  Iteration 3414/3560 Training loss: 1.3370 1.9500 sec/batch
Epoch 20/20  Iteration 3415/3560 Training loss: 1.3372 1.9613 sec/batch
Epoch 20/20  Iteration 3416/3560 Training loss: 1.3375 1.8743 sec/batch
Epoch 20/20  Iteration 3417/3560 Training loss: 1.3369 1.9139 sec/batch
Epoch 20/20  Iteration 3418/3560 Training loss: 1.3369 1.9199 sec/batch
Epoch 20/20  Iteration 3419/3560 Training loss: 1.3360 1.9953 sec/batch
Epoch 20/20  Iteration 3420/3560 Training loss: 1.3349 1.9193 sec/batch
Epoch 20/20  Iteration 3421/3560 Training loss: 1.3339 1.9160 sec/batch
Epoch 20/20  Iteration 3422/3560 Training loss: 1.3336 1.9191 sec/batch
Epoch 20/20  Iteration 3423/3560 Training loss: 1.3330 1.9358 sec/batch
Epoch 20/20  Iteration 3424/3560 Training loss: 1.3337 1.9246 sec/batch
Epoch 20/20  Iteration 3425/3560 Training loss: 1.3335 1.8992 sec/batch
Epoch 20/20  Iteration 3426/3560 Training loss: 1.3330 1.8752 sec/batch
Epoch 20/20  Iteration 3427/3560 Training loss: 1.3333 1.9328 sec/batch
Epoch 20/20  Iteration 3428/3560 Training loss: 1.3324 1.9412 sec/batch
Epoch 20/20  Iteration 3429/3560 Training loss: 1.3325 1.9183 sec/batch
Epoch 20/20  Iteration 3430/3560 Training loss: 1.3321 1.9173 sec/batch
Epoch 20/20  Iteration 3431/3560 Training loss: 1.3320 1.9741 sec/batch
Epoch 20/20  Iteration 3432/3560 Training loss: 1.3323 1.8914 sec/batch
Epoch 20/20  Iteration 3433/3560 Training loss: 1.3318 1.9142 sec/batch
Epoch 20/20  Iteration 3434/3560 Training loss: 1.3325 2.0499 sec/batch
Epoch 20/20  Iteration 3435/3560 Training loss: 1.3324 2.0096 sec/batch
Epoch 20/20  Iteration 3436/3560 Training loss: 1.3329 1.9872 sec/batch
Epoch 20/20  Iteration 3437/3560 Training loss: 1.3327 1.9765 sec/batch
Epoch 20/20  Iteration 3438/3560 Training loss: 1.3330 2.5528 sec/batch
Epoch 20/20  Iteration 3439/3560 Training loss: 1.3333 2.2404 sec/batch
Epoch 20/20  Iteration 3440/3560 Training loss: 1.3331 1.9213 sec/batch
Epoch 20/20  Iteration 3441/3560 Training loss: 1.3326 1.9087 sec/batch
Epoch 20/20  Iteration 3442/3560 Training loss: 1.3332 2.1758 sec/batch
Epoch 20/20  Iteration 3443/3560 Training loss: 1.3333 1.9047 sec/batch
Epoch 20/20  Iteration 3444/3560 Training loss: 1.3344 1.9009 sec/batch
Epoch 20/20  Iteration 3445/3560 Training loss: 1.3347 2.1305 sec/batch
Epoch 20/20  Iteration 3446/3560 Training loss: 1.3350 2.0161 sec/batch
Epoch 20/20  Iteration 3447/3560 Training loss: 1.3350 1.8851 sec/batch
Epoch 20/20  Iteration 3448/3560 Training loss: 1.3352 1.9246 sec/batch
Epoch 20/20  Iteration 3449/3560 Training loss: 1.3355 1.9103 sec/batch
Epoch 20/20  Iteration 3450/3560 Training loss: 1.3353 1.9696 sec/batch
Epoch 20/20  Iteration 3451/3560 Training loss: 1.3355 1.9444 sec/batch
Epoch 20/20  Iteration 3452/3560 Training loss: 1.3355 1.9200 sec/batch
Epoch 20/20  Iteration 3453/3560 Training loss: 1.3359 1.9252 sec/batch
Epoch 20/20  Iteration 3454/3560 Training loss: 1.3362 1.9875 sec/batch
Epoch 20/20  Iteration 3455/3560 Training loss: 1.3367 1.9091 sec/batch
Epoch 20/20  Iteration 3456/3560 Training loss: 1.3364 1.8577 sec/batch
Epoch 20/20  Iteration 3457/3560 Training loss: 1.3362 1.8954 sec/batch
Epoch 20/20  Iteration 3458/3560 Training loss: 1.3363 1.9967 sec/batch
Epoch 20/20  Iteration 3459/3560 Training loss: 1.3363 1.9321 sec/batch
Epoch 20/20  Iteration 3460/3560 Training loss: 1.3362 1.9028 sec/batch
Epoch 20/20  Iteration 3461/3560 Training loss: 1.3357 1.8783 sec/batch
Epoch 20/20  Iteration 3462/3560 Training loss: 1.3356 1.9150 sec/batch
Epoch 20/20  Iteration 3463/3560 Training loss: 1.3352 1.9098 sec/batch
Epoch 20/20  Iteration 3464/3560 Training loss: 1.3352 1.9168 sec/batch
Epoch 20/20  Iteration 3465/3560 Training loss: 1.3347 1.9066 sec/batch
Epoch 20/20  Iteration 3466/3560 Training loss: 1.3346 1.9374 sec/batch
Epoch 20/20  Iteration 3467/3560 Training loss: 1.3343 1.9427 sec/batch
Epoch 20/20  Iteration 3468/3560 Training loss: 1.3343 1.9303 sec/batch
Epoch 20/20  Iteration 3469/3560 Training loss: 1.3341 1.9088 sec/batch
Epoch 20/20  Iteration 3470/3560 Training loss: 1.3339 1.8942 sec/batch
Epoch 20/20  Iteration 3471/3560 Training loss: 1.3336 1.9585 sec/batch
Epoch 20/20  Iteration 3472/3560 Training loss: 1.3337 1.9298 sec/batch
Epoch 20/20  Iteration 3473/3560 Training loss: 1.3335 1.9349 sec/batch
Epoch 20/20  Iteration 3474/3560 Training loss: 1.3334 1.9373 sec/batch
Epoch 20/20  Iteration 3475/3560 Training loss: 1.3330 2.0507 sec/batch
Epoch 20/20  Iteration 3476/3560 Training loss: 1.3327 2.4845 sec/batch
Epoch 20/20  Iteration 3477/3560 Training loss: 1.3325 2.5398 sec/batch
Epoch 20/20  Iteration 3478/3560 Training loss: 1.3325 1.9923 sec/batch
Epoch 20/20  Iteration 3479/3560 Training loss: 1.3326 2.4188 sec/batch
Epoch 20/20  Iteration 3480/3560 Training loss: 1.3322 2.6403 sec/batch
Epoch 20/20  Iteration 3481/3560 Training loss: 1.3319 2.5392 sec/batch
Epoch 20/20  Iteration 3482/3560 Training loss: 1.3314 2.8198 sec/batch
Epoch 20/20  Iteration 3483/3560 Training loss: 1.3315 2.2864 sec/batch
Epoch 20/20  Iteration 3484/3560 Training loss: 1.3314 2.0363 sec/batch
Epoch 20/20  Iteration 3485/3560 Training loss: 1.3313 1.9492 sec/batch
Epoch 20/20  Iteration 3486/3560 Training loss: 1.3313 2.1306 sec/batch
Epoch 20/20  Iteration 3487/3560 Training loss: 1.3312 1.9724 sec/batch
Epoch 20/20  Iteration 3488/3560 Training loss: 1.3312 1.9496 sec/batch
Epoch 20/20  Iteration 3489/3560 Training loss: 1.3312 1.8903 sec/batch
Epoch 20/20  Iteration 3490/3560 Training loss: 1.3312 1.9488 sec/batch
Epoch 20/20  Iteration 3491/3560 Training loss: 1.3311 1.9420 sec/batch
Epoch 20/20  Iteration 3492/3560 Training loss: 1.3311 1.9128 sec/batch
Epoch 20/20  Iteration 3493/3560 Training loss: 1.3310 2.0403 sec/batch
Epoch 20/20  Iteration 3494/3560 Training loss: 1.3310 2.1441 sec/batch
Epoch 20/20  Iteration 3495/3560 Training loss: 1.3309 1.9976 sec/batch
Epoch 20/20  Iteration 3496/3560 Training loss: 1.3308 2.0053 sec/batch
Epoch 20/20  Iteration 3497/3560 Training loss: 1.3306 2.3496 sec/batch
Epoch 20/20  Iteration 3498/3560 Training loss: 1.3303 2.0728 sec/batch
Epoch 20/20  Iteration 3499/3560 Training loss: 1.3302 1.9891 sec/batch
Epoch 20/20  Iteration 3500/3560 Training loss: 1.3304 1.9398 sec/batch
Epoch 20/20  Iteration 3501/3560 Training loss: 1.3303 1.9292 sec/batch
Epoch 20/20  Iteration 3502/3560 Training loss: 1.3303 2.0416 sec/batch
Epoch 20/20  Iteration 3503/3560 Training loss: 1.3302 1.9904 sec/batch
Epoch 20/20  Iteration 3504/3560 Training loss: 1.3299 2.0333 sec/batch
Epoch 20/20  Iteration 3505/3560 Training loss: 1.3295 2.0489 sec/batch
Epoch 20/20  Iteration 3506/3560 Training loss: 1.3295 1.9363 sec/batch
Epoch 20/20  Iteration 3507/3560 Training loss: 1.3295 1.9436 sec/batch
Epoch 20/20  Iteration 3508/3560 Training loss: 1.3292 1.9440 sec/batch
Epoch 20/20  Iteration 3509/3560 Training loss: 1.3292 1.9082 sec/batch
Epoch 20/20  Iteration 3510/3560 Training loss: 1.3292 1.9893 sec/batch
Epoch 20/20  Iteration 3511/3560 Training loss: 1.3291 1.9698 sec/batch
Epoch 20/20  Iteration 3512/3560 Training loss: 1.3289 2.0169 sec/batch
Epoch 20/20  Iteration 3513/3560 Training loss: 1.3285 2.2390 sec/batch
Epoch 20/20  Iteration 3514/3560 Training loss: 1.3284 2.4981 sec/batch
Epoch 20/20  Iteration 3515/3560 Training loss: 1.3285 2.3686 sec/batch
Epoch 20/20  Iteration 3516/3560 Training loss: 1.3287 2.6732 sec/batch
Epoch 20/20  Iteration 3517/3560 Training loss: 1.3286 2.4817 sec/batch
Epoch 20/20  Iteration 3518/3560 Training loss: 1.3287 2.2095 sec/batch
Epoch 20/20  Iteration 3519/3560 Training loss: 1.3289 2.1597 sec/batch
Epoch 20/20  Iteration 3520/3560 Training loss: 1.3291 1.9702 sec/batch
Epoch 20/20  Iteration 3521/3560 Training loss: 1.3291 1.9979 sec/batch
Epoch 20/20  Iteration 3522/3560 Training loss: 1.3292 1.9581 sec/batch
Epoch 20/20  Iteration 3523/3560 Training loss: 1.3295 2.1321 sec/batch
Epoch 20/20  Iteration 3524/3560 Training loss: 1.3295 1.9565 sec/batch
Epoch 20/20  Iteration 3525/3560 Training loss: 1.3295 1.9694 sec/batch
Epoch 20/20  Iteration 3526/3560 Training loss: 1.3297 1.9994 sec/batch
Epoch 20/20  Iteration 3527/3560 Training loss: 1.3296 1.9695 sec/batch
Epoch 20/20  Iteration 3528/3560 Training loss: 1.3297 1.9826 sec/batch
Epoch 20/20  Iteration 3529/3560 Training loss: 1.3297 1.9432 sec/batch
Epoch 20/20  Iteration 3530/3560 Training loss: 1.3299 1.9726 sec/batch
Epoch 20/20  Iteration 3531/3560 Training loss: 1.3301 2.0707 sec/batch
Epoch 20/20  Iteration 3532/3560 Training loss: 1.3300 2.0003 sec/batch
Epoch 20/20  Iteration 3533/3560 Training loss: 1.3298 1.9560 sec/batch
Epoch 20/20  Iteration 3534/3560 Training loss: 1.3297 1.9969 sec/batch
Epoch 20/20  Iteration 3535/3560 Training loss: 1.3297 2.0231 sec/batch
Epoch 20/20  Iteration 3536/3560 Training loss: 1.3297 2.0025 sec/batch
Epoch 20/20  Iteration 3537/3560 Training loss: 1.3298 1.9307 sec/batch
Epoch 20/20  Iteration 3538/3560 Training loss: 1.3298 2.0254 sec/batch
Epoch 20/20  Iteration 3539/3560 Training loss: 1.3298 2.0382 sec/batch
Epoch 20/20  Iteration 3540/3560 Training loss: 1.3297 2.0193 sec/batch
Epoch 20/20  Iteration 3541/3560 Training loss: 1.3295 1.9971 sec/batch
Epoch 20/20  Iteration 3542/3560 Training loss: 1.3296 2.0150 sec/batch
Epoch 20/20  Iteration 3543/3560 Training loss: 1.3298 2.0124 sec/batch
Epoch 20/20  Iteration 3544/3560 Training loss: 1.3299 1.9449 sec/batch
Epoch 20/20  Iteration 3545/3560 Training loss: 1.3299 1.9766 sec/batch
Epoch 20/20  Iteration 3546/3560 Training loss: 1.3299 1.9730 sec/batch
Epoch 20/20  Iteration 3547/3560 Training loss: 1.3300 2.0913 sec/batch
Epoch 20/20  Iteration 3548/3560 Training loss: 1.3299 2.1117 sec/batch
Epoch 20/20  Iteration 3549/3560 Training loss: 1.3301 2.0724 sec/batch
Epoch 20/20  Iteration 3550/3560 Training loss: 1.3305 1.9695 sec/batch
Epoch 20/20  Iteration 3551/3560 Training loss: 1.3305 2.1211 sec/batch
Epoch 20/20  Iteration 3552/3560 Training loss: 1.3305 2.1517 sec/batch
Epoch 20/20  Iteration 3553/3560 Training loss: 1.3304 2.1978 sec/batch
Epoch 20/20  Iteration 3554/3560 Training loss: 1.3303 2.1375 sec/batch
Epoch 20/20  Iteration 3555/3560 Training loss: 1.3305 2.3222 sec/batch
Epoch 20/20  Iteration 3556/3560 Training loss: 1.3306 2.5997 sec/batch
Epoch 20/20  Iteration 3557/3560 Training loss: 1.3306 2.2015 sec/batch
Epoch 20/20  Iteration 3558/3560 Training loss: 1.3305 2.3433 sec/batch
Epoch 20/20  Iteration 3559/3560 Training loss: 1.3304 2.6284 sec/batch
Epoch 20/20  Iteration 3560/3560 Training loss: 1.3306 2.6614 sec/batch
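The run above finishes with the training loss settling around 1.33, down from roughly 4.4 at the very first iteration; the log below starts a fresh run from epoch 1, where the loss begins near 4.4 again. Since these progress lines are only printed to the screen, one simple way to inspect the loss curve afterwards is to capture them to a file and parse them back out. Here's a minimal sketch of that, assuming the printed output was saved somewhere; the file name training_log.txt and the parse_log helper are just placeholders, not part of the training code itself.

import re
import matplotlib.pyplot as plt

# Matches the format printed during training:
# "Epoch e/E  Iteration i/I Training loss: L  t sec/batch"
LOG_LINE = re.compile(r"Epoch (\d+)/\d+\s+Iteration (\d+)/\d+\s+"
                      r"Training loss: ([\d.]+)\s+([\d.]+) sec/batch")

def parse_log(path):
    """Return parallel lists of iteration numbers and training losses from a saved log."""
    iterations, losses = [], []
    with open(path) as f:
        for line in f:
            m = LOG_LINE.search(line)
            if m:
                iterations.append(int(m.group(2)))
                losses.append(float(m.group(3)))
    return iterations, losses

# Hypothetical usage: plot the loss curve from a captured log file.
iterations, losses = parse_log('training_log.txt')
plt.plot(iterations, losses)
plt.xlabel('Iteration')
plt.ylabel('Training loss')
plt.show()

If the file holds more than one run (as the output here does), the iteration counter restarts at 1 for each run, so plotting against the line index instead of the iteration number keeps the curves from overlapping.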
Epoch 1/20  Iteration 1/3560 Training loss: 4.4208 2.0547 sec/batch
Epoch 1/20  Iteration 2/3560 Training loss: 4.4040 1.9007 sec/batch
Epoch 1/20  Iteration 3/3560 Training loss: 4.3777 1.9315 sec/batch
Epoch 1/20  Iteration 4/3560 Training loss: 4.3169 2.0353 sec/batch
Epoch 1/20  Iteration 5/3560 Training loss: 4.1932 2.1528 sec/batch
Epoch 1/20  Iteration 6/3560 Training loss: 4.1090 2.1234 sec/batch
Epoch 1/20  Iteration 7/3560 Training loss: 4.0322 1.9797 sec/batch
Epoch 1/20  Iteration 8/3560 Training loss: 3.9698 2.0085 sec/batch
Epoch 1/20  Iteration 9/3560 Training loss: 3.9131 2.1186 sec/batch
Epoch 1/20  Iteration 10/3560 Training loss: 3.8642 2.2232 sec/batch
Epoch 1/20  Iteration 11/3560 Training loss: 3.8193 1.9777 sec/batch
Epoch 1/20  Iteration 12/3560 Training loss: 3.7816 1.9582 sec/batch
Epoch 1/20  Iteration 13/3560 Training loss: 3.7481 2.1060 sec/batch
Epoch 1/20  Iteration 14/3560 Training loss: 3.7187 1.8993 sec/batch
Epoch 1/20  Iteration 15/3560 Training loss: 3.6917 1.9476 sec/batch
Epoch 1/20  Iteration 16/3560 Training loss: 3.6678 1.8976 sec/batch
Epoch 1/20  Iteration 17/3560 Training loss: 3.6454 2.1218 sec/batch
Epoch 1/20  Iteration 18/3560 Training loss: 3.6270 1.9766 sec/batch
Epoch 1/20  Iteration 19/3560 Training loss: 3.6096 1.9130 sec/batch
Epoch 1/20  Iteration 20/3560 Training loss: 3.5915 1.9418 sec/batch
Epoch 1/20  Iteration 21/3560 Training loss: 3.5762 2.0578 sec/batch
Epoch 1/20  Iteration 22/3560 Training loss: 3.5619 2.0052 sec/batch
Epoch 1/20  Iteration 23/3560 Training loss: 3.5482 1.9114 sec/batch
Epoch 1/20  Iteration 24/3560 Training loss: 3.5354 1.9379 sec/batch
Epoch 1/20  Iteration 25/3560 Training loss: 3.5232 2.0364 sec/batch
Epoch 1/20  Iteration 26/3560 Training loss: 3.5123 2.0369 sec/batch
Epoch 1/20  Iteration 27/3560 Training loss: 3.5024 2.0610 sec/batch
Epoch 1/20  Iteration 28/3560 Training loss: 3.4926 1.9720 sec/batch
Epoch 1/20  Iteration 29/3560 Training loss: 3.4837 1.9988 sec/batch
Epoch 1/20  Iteration 30/3560 Training loss: 3.4749 2.0350 sec/batch
Epoch 1/20  Iteration 31/3560 Training loss: 3.4677 2.1579 sec/batch
Epoch 1/20  Iteration 32/3560 Training loss: 3.4601 2.0824 sec/batch
Epoch 1/20  Iteration 33/3560 Training loss: 3.4522 1.9581 sec/batch
Epoch 1/20  Iteration 34/3560 Training loss: 3.4454 2.0425 sec/batch
Epoch 1/20  Iteration 35/3560 Training loss: 3.4385 1.9226 sec/batch
Epoch 1/20  Iteration 36/3560 Training loss: 3.4324 1.9966 sec/batch
Epoch 1/20  Iteration 37/3560 Training loss: 3.4257 2.0107 sec/batch
Epoch 1/20  Iteration 38/3560 Training loss: 3.4196 2.0675 sec/batch
Epoch 1/20  Iteration 39/3560 Training loss: 3.4132 2.1322 sec/batch
Epoch 1/20  Iteration 40/3560 Training loss: 3.4075 2.5369 sec/batch
Epoch 1/20  Iteration 41/3560 Training loss: 3.4018 2.6461 sec/batch
Epoch 1/20  Iteration 42/3560 Training loss: 3.3966 2.0595 sec/batch
Epoch 1/20  Iteration 43/3560 Training loss: 3.3916 1.9717 sec/batch
Epoch 1/20  Iteration 44/3560 Training loss: 3.3867 2.2546 sec/batch
Epoch 1/20  Iteration 45/3560 Training loss: 3.3818 1.9692 sec/batch
Epoch 1/20  Iteration 46/3560 Training loss: 3.3775 2.1119 sec/batch
Epoch 1/20  Iteration 47/3560 Training loss: 3.3735 2.0231 sec/batch
Epoch 1/20  Iteration 48/3560 Training loss: 3.3698 2.0537 sec/batch
Epoch 1/20  Iteration 49/3560 Training loss: 3.3660 1.9816 sec/batch
Epoch 1/20  Iteration 50/3560 Training loss: 3.3625 2.1809 sec/batch
Epoch 1/20  Iteration 51/3560 Training loss: 3.3588 2.1725 sec/batch
Epoch 1/20  Iteration 52/3560 Training loss: 3.3552 2.1517 sec/batch
Epoch 1/20  Iteration 53/3560 Training loss: 3.3519 1.9733 sec/batch
Epoch 1/20  Iteration 54/3560 Training loss: 3.3483 2.0433 sec/batch
Epoch 1/20  Iteration 55/3560 Training loss: 3.3451 1.9307 sec/batch
Epoch 1/20  Iteration 56/3560 Training loss: 3.3417 2.0592 sec/batch
Epoch 1/20  Iteration 57/3560 Training loss: 3.3386 2.1389 sec/batch
Epoch 1/20  Iteration 58/3560 Training loss: 3.3356 2.1041 sec/batch
Epoch 1/20  Iteration 59/3560 Training loss: 3.3325 2.0077 sec/batch
Epoch 1/20  Iteration 60/3560 Training loss: 3.3297 2.0336 sec/batch
Epoch 1/20  Iteration 61/3560 Training loss: 3.3270 1.9693 sec/batch
Epoch 1/20  Iteration 62/3560 Training loss: 3.3246 1.9980 sec/batch
Epoch 1/20  Iteration 63/3560 Training loss: 3.3224 2.0091 sec/batch
Epoch 1/20  Iteration 64/3560 Training loss: 3.3196 2.0363 sec/batch
Epoch 1/20  Iteration 65/3560 Training loss: 3.3170 1.9643 sec/batch
Epoch 1/20  Iteration 66/3560 Training loss: 3.3149 2.0739 sec/batch
Epoch 1/20  Iteration 67/3560 Training loss: 3.3127 2.1195 sec/batch
Epoch 1/20  Iteration 68/3560 Training loss: 3.3099 1.9934 sec/batch
Epoch 1/20  Iteration 69/3560 Training loss: 3.3076 2.0027 sec/batch
Epoch 1/20  Iteration 70/3560 Training loss: 3.3055 1.9427 sec/batch
Epoch 1/20  Iteration 71/3560 Training loss: 3.3034 2.2364 sec/batch
Epoch 1/20  Iteration 72/3560 Training loss: 3.3016 2.1550 sec/batch
Epoch 1/20  Iteration 73/3560 Training loss: 3.2995 1.9115 sec/batch
Epoch 1/20  Iteration 74/3560 Training loss: 3.2976 2.0047 sec/batch
Epoch 1/20  Iteration 75/3560 Training loss: 3.2958 1.9813 sec/batch
Epoch 1/20  Iteration 76/3560 Training loss: 3.2941 2.0181 sec/batch
Epoch 1/20  Iteration 77/3560 Training loss: 3.2924 2.0781 sec/batch
Epoch 1/20  Iteration 78/3560 Training loss: 3.2906 2.0446 sec/batch
Epoch 1/20  Iteration 79/3560 Training loss: 3.2889 2.2573 sec/batch
Epoch 1/20  Iteration 80/3560 Training loss: 3.2869 2.1057 sec/batch
Epoch 1/20  Iteration 81/3560 Training loss: 3.2852 2.2322 sec/batch
Epoch 1/20  Iteration 82/3560 Training loss: 3.2836 2.7214 sec/batch
Epoch 1/20  Iteration 83/3560 Training loss: 3.2821 2.2988 sec/batch
Epoch 1/20  Iteration 84/3560 Training loss: 3.2804 2.1824 sec/batch
Epoch 1/20  Iteration 85/3560 Training loss: 3.2787 2.7078 sec/batch
Epoch 1/20  Iteration 86/3560 Training loss: 3.2770 2.7612 sec/batch
Epoch 1/20  Iteration 87/3560 Training loss: 3.2753 2.3634 sec/batch
Epoch 1/20  Iteration 88/3560 Training loss: 3.2737 2.1346 sec/batch
Epoch 1/20  Iteration 89/3560 Training loss: 3.2723 2.0121 sec/batch
Epoch 1/20  Iteration 90/3560 Training loss: 3.2709 2.1950 sec/batch
Epoch 1/20  Iteration 91/3560 Training loss: 3.2696 2.2915 sec/batch
Epoch 1/20  Iteration 92/3560 Training loss: 3.2682 2.0362 sec/batch
Epoch 1/20  Iteration 93/3560 Training loss: 3.2668 1.9884 sec/batch
Epoch 1/20  Iteration 94/3560 Training loss: 3.2655 1.9319 sec/batch
Epoch 1/20  Iteration 95/3560 Training loss: 3.2641 2.1917 sec/batch
Epoch 1/20  Iteration 96/3560 Training loss: 3.2627 2.2784 sec/batch
Epoch 1/20  Iteration 97/3560 Training loss: 3.2614 1.9791 sec/batch
Epoch 1/20  Iteration 98/3560 Training loss: 3.2601 1.9659 sec/batch
Epoch 1/20  Iteration 99/3560 Training loss: 3.2588 2.0394 sec/batch
Epoch 1/20  Iteration 100/3560 Training loss: 3.2576 1.9906 sec/batch
Epoch 1/20  Iteration 101/3560 Training loss: 3.2563 1.9430 sec/batch
Epoch 1/20  Iteration 102/3560 Training loss: 3.2551 1.9437 sec/batch
Epoch 1/20  Iteration 103/3560 Training loss: 3.2539 2.0103 sec/batch
Epoch 1/20  Iteration 104/3560 Training loss: 3.2527 2.0254 sec/batch
Epoch 1/20  Iteration 105/3560 Training loss: 3.2515 1.9522 sec/batch
Epoch 1/20  Iteration 106/3560 Training loss: 3.2503 2.0618 sec/batch
Epoch 1/20  Iteration 107/3560 Training loss: 3.2489 2.0470 sec/batch
Epoch 1/20  Iteration 108/3560 Training loss: 3.2476 1.9807 sec/batch
Epoch 1/20  Iteration 109/3560 Training loss: 3.2464 1.9866 sec/batch
Epoch 1/20  Iteration 110/3560 Training loss: 3.2449 2.0615 sec/batch
Epoch 1/20  Iteration 111/3560 Training loss: 3.2437 2.2468 sec/batch
Epoch 1/20  Iteration 112/3560 Training loss: 3.2426 2.0967 sec/batch
Epoch 1/20  Iteration 113/3560 Training loss: 3.2413 1.9914 sec/batch
Epoch 1/20  Iteration 114/3560 Training loss: 3.2399 2.3107 sec/batch
Epoch 1/20  Iteration 115/3560 Training loss: 3.2386 2.6476 sec/batch
Epoch 1/20  Iteration 116/3560 Training loss: 3.2372 2.6255 sec/batch
Epoch 1/20  Iteration 117/3560 Training loss: 3.2359 2.0509 sec/batch
Epoch 1/20  Iteration 118/3560 Training loss: 3.2347 1.9881 sec/batch
Epoch 1/20  Iteration 119/3560 Training loss: 3.2336 2.1994 sec/batch
Epoch 1/20  Iteration 120/3560 Training loss: 3.2322 2.0360 sec/batch
Epoch 1/20  Iteration 121/3560 Training loss: 3.2312 2.0200 sec/batch
Epoch 1/20  Iteration 122/3560 Training loss: 3.2300 1.9535 sec/batch
Epoch 1/20  Iteration 123/3560 Training loss: 3.2287 2.0387 sec/batch
Epoch 1/20  Iteration 124/3560 Training loss: 3.2275 1.9717 sec/batch
Epoch 1/20  Iteration 125/3560 Training loss: 3.2262 1.9536 sec/batch
Epoch 1/20  Iteration 126/3560 Training loss: 3.2247 2.0975 sec/batch
Epoch 1/20  Iteration 127/3560 Training loss: 3.2235 2.1417 sec/batch
Epoch 1/20  Iteration 128/3560 Training loss: 3.2222 2.0576 sec/batch
Epoch 1/20  Iteration 129/3560 Training loss: 3.2207 1.9611 sec/batch
Epoch 1/20  Iteration 130/3560 Training loss: 3.2194 2.1719 sec/batch
Epoch 1/20  Iteration 131/3560 Training loss: 3.2181 2.3374 sec/batch
Epoch 1/20  Iteration 132/3560 Training loss: 3.2166 1.9701 sec/batch
Epoch 1/20  Iteration 133/3560 Training loss: 3.2152 1.9729 sec/batch
Epoch 1/20  Iteration 134/3560 Training loss: 3.2138 1.9925 sec/batch
Epoch 1/20  Iteration 135/3560 Training loss: 3.2120 1.9817 sec/batch
Epoch 1/20  Iteration 136/3560 Training loss: 3.2103 2.0009 sec/batch
Epoch 1/20  Iteration 137/3560 Training loss: 3.2087 2.0270 sec/batch
Epoch 1/20  Iteration 138/3560 Training loss: 3.2070 2.0220 sec/batch
Epoch 1/20  Iteration 139/3560 Training loss: 3.2055 1.9372 sec/batch
Epoch 1/20  Iteration 140/3560 Training loss: 3.2038 2.0165 sec/batch
Epoch 1/20  Iteration 141/3560 Training loss: 3.2021 2.0207 sec/batch
Epoch 1/20  Iteration 142/3560 Training loss: 3.2003 2.0089 sec/batch
Epoch 1/20  Iteration 143/3560 Training loss: 3.1986 1.9910 sec/batch
Epoch 1/20  Iteration 144/3560 Training loss: 3.1968 1.9895 sec/batch
Epoch 1/20  Iteration 145/3560 Training loss: 3.1951 1.9363 sec/batch
Epoch 1/20  Iteration 146/3560 Training loss: 3.1934 2.0872 sec/batch
Epoch 1/20  Iteration 147/3560 Training loss: 3.1917 2.0904 sec/batch
Epoch 1/20  Iteration 148/3560 Training loss: 3.1901 1.9963 sec/batch
Epoch 1/20  Iteration 149/3560 Training loss: 3.1881 2.0281 sec/batch
Epoch 1/20  Iteration 150/3560 Training loss: 3.1861 2.1103 sec/batch
Epoch 1/20  Iteration 151/3560 Training loss: 3.1844 2.4150 sec/batch
Epoch 1/20  Iteration 152/3560 Training loss: 3.1828 2.1527 sec/batch
Epoch 1/20  Iteration 153/3560 Training loss: 3.1808 1.9942 sec/batch
Epoch 1/20  Iteration 154/3560 Training loss: 3.1788 2.3904 sec/batch
Epoch 1/20  Iteration 155/3560 Training loss: 3.1767 2.5148 sec/batch
Epoch 1/20  Iteration 156/3560 Training loss: 3.1746 2.7767 sec/batch
Epoch 1/20  Iteration 157/3560 Training loss: 3.1724 2.4884 sec/batch
Epoch 1/20  Iteration 158/3560 Training loss: 3.1702 2.1088 sec/batch
Epoch 1/20  Iteration 159/3560 Training loss: 3.1679 1.9296 sec/batch
Epoch 1/20  Iteration 160/3560 Training loss: 3.1657 2.0059 sec/batch
Epoch 1/20  Iteration 161/3560 Training loss: 3.1636 2.0221 sec/batch
Epoch 1/20  Iteration 162/3560 Training loss: 3.1612 1.9557 sec/batch
Epoch 1/20  Iteration 163/3560 Training loss: 3.1588 1.9899 sec/batch
Epoch 1/20  Iteration 164/3560 Training loss: 3.1565 2.0740 sec/batch
Epoch 1/20  Iteration 165/3560 Training loss: 3.1542 1.9903 sec/batch
Epoch 1/20  Iteration 166/3560 Training loss: 3.1518 2.0931 sec/batch
Epoch 1/20  Iteration 167/3560 Training loss: 3.1495 2.1034 sec/batch
Epoch 1/20  Iteration 168/3560 Training loss: 3.1471 2.0851 sec/batch
Epoch 1/20  Iteration 169/3560 Training loss: 3.1448 1.9864 sec/batch
Epoch 1/20  Iteration 170/3560 Training loss: 3.1423 2.1323 sec/batch
Epoch 1/20  Iteration 171/3560 Training loss: 3.1400 2.2822 sec/batch
Epoch 1/20  Iteration 172/3560 Training loss: 3.1377 2.1208 sec/batch
Epoch 1/20  Iteration 173/3560 Training loss: 3.1356 1.9672 sec/batch
Epoch 1/20  Iteration 174/3560 Training loss: 3.1334 1.9660 sec/batch
Epoch 1/20  Iteration 175/3560 Training loss: 3.1312 2.2279 sec/batch
Epoch 1/20  Iteration 176/3560 Training loss: 3.1291 2.5314 sec/batch
Epoch 1/20  Iteration 177/3560 Training loss: 3.1267 2.5539 sec/batch
Epoch 1/20  Iteration 178/3560 Training loss: 3.1242 2.2870 sec/batch
Epoch 2/20  Iteration 179/3560 Training loss: 2.7438 1.9982 sec/batch
Epoch 2/20  Iteration 180/3560 Training loss: 2.7070 2.1893 sec/batch
Epoch 2/20  Iteration 181/3560 Training loss: 2.6911 2.6985 sec/batch
Epoch 2/20  Iteration 182/3560 Training loss: 2.6878 2.4707 sec/batch
Epoch 2/20  Iteration 183/3560 Training loss: 2.6831 2.7035 sec/batch
Epoch 2/20  Iteration 184/3560 Training loss: 2.6800 2.4625 sec/batch
Epoch 2/20  Iteration 185/3560 Training loss: 2.6793 2.9458 sec/batch
Epoch 2/20  Iteration 186/3560 Training loss: 2.6780 2.5839 sec/batch
Epoch 2/20  Iteration 187/3560 Training loss: 2.6766 2.3762 sec/batch
Epoch 2/20  Iteration 188/3560 Training loss: 2.6730 2.6514 sec/batch
Epoch 2/20  Iteration 189/3560 Training loss: 2.6687 2.5741 sec/batch
Epoch 2/20  Iteration 190/3560 Training loss: 2.6674 2.4845 sec/batch
Epoch 2/20  Iteration 191/3560 Training loss: 2.6643 2.3212 sec/batch
Epoch 2/20  Iteration 192/3560 Training loss: 2.6639 2.3903 sec/batch
Epoch 2/20  Iteration 193/3560 Training loss: 2.6617 2.0198 sec/batch
Epoch 2/20  Iteration 194/3560 Training loss: 2.6598 1.9975 sec/batch
Epoch 2/20  Iteration 195/3560 Training loss: 2.6577 2.1922 sec/batch
Epoch 2/20  Iteration 196/3560 Training loss: 2.6575 2.4842 sec/batch
Epoch 2/20  Iteration 197/3560 Training loss: 2.6557 2.6655 sec/batch
Epoch 2/20  Iteration 198/3560 Training loss: 2.6523 2.3052 sec/batch
Epoch 2/20  Iteration 199/3560 Training loss: 2.6500 1.8662 sec/batch
Epoch 2/20  Iteration 200/3560 Training loss: 2.6487 1.8505 sec/batch
Epoch 2/20  Iteration 201/3560 Training loss: 2.6466 1.8132 sec/batch
Epoch 2/20  Iteration 202/3560 Training loss: 2.6444 1.9647 sec/batch
Epoch 2/20  Iteration 203/3560 Training loss: 2.6415 2.1385 sec/batch
Epoch 2/20  Iteration 204/3560 Training loss: 2.6402 1.9981 sec/batch
Epoch 2/20  Iteration 205/3560 Training loss: 2.6380 1.9288 sec/batch
Epoch 2/20  Iteration 206/3560 Training loss: 2.6358 2.1765 sec/batch
Epoch 2/20  Iteration 207/3560 Training loss: 2.6342 2.5760 sec/batch
Epoch 2/20  Iteration 208/3560 Training loss: 2.6325 2.1677 sec/batch
Epoch 2/20  Iteration 209/3560 Training loss: 2.6311 2.0518 sec/batch
Epoch 2/20  Iteration 210/3560 Training loss: 2.6289 2.2394 sec/batch
Epoch 2/20  Iteration 211/3560 Training loss: 2.6262 1.9009 sec/batch
Epoch 2/20  Iteration 212/3560 Training loss: 2.6243 1.9382 sec/batch
Epoch 2/20  Iteration 213/3560 Training loss: 2.6220 1.8558 sec/batch
Epoch 2/20  Iteration 214/3560 Training loss: 2.6204 2.0581 sec/batch
Epoch 2/20  Iteration 215/3560 Training loss: 2.6183 2.0889 sec/batch
Epoch 2/20  Iteration 216/3560 Training loss: 2.6157 1.8790 sec/batch
Epoch 2/20  Iteration 217/3560 Training loss: 2.6136 1.8682 sec/batch
Epoch 2/20  Iteration 218/3560 Training loss: 2.6114 1.8693 sec/batch
Epoch 2/20  Iteration 219/3560 Training loss: 2.6091 2.1732 sec/batch
Epoch 2/20  Iteration 220/3560 Training loss: 2.6070 1.9555 sec/batch
Epoch 2/20  Iteration 221/3560 Training loss: 2.6049 2.1936 sec/batch
Epoch 2/20  Iteration 222/3560 Training loss: 2.6030 2.0148 sec/batch
Epoch 2/20  Iteration 223/3560 Training loss: 2.6010 2.0341 sec/batch
Epoch 2/20  Iteration 224/3560 Training loss: 2.5985 2.0382 sec/batch
Epoch 2/20  Iteration 225/3560 Training loss: 2.5968 2.3684 sec/batch
Epoch 2/20  Iteration 226/3560 Training loss: 2.5951 2.2369 sec/batch
Epoch 2/20  Iteration 227/3560 Training loss: 2.5933 2.5585 sec/batch
Epoch 2/20  Iteration 228/3560 Training loss: 2.5920 2.4142 sec/batch
Epoch 2/20  Iteration 229/3560 Training loss: 2.5903 2.7021 sec/batch
Epoch 2/20  Iteration 230/3560 Training loss: 2.5886 2.5633 sec/batch
Epoch 2/20  Iteration 231/3560 Training loss: 2.5868 1.9538 sec/batch
Epoch 2/20  Iteration 232/3560 Training loss: 2.5851 1.9241 sec/batch
Epoch 2/20  Iteration 233/3560 Training loss: 2.5834 1.9309 sec/batch
Epoch 2/20  Iteration 234/3560 Training loss: 2.5820 1.8689 sec/batch
Epoch 2/20  Iteration 235/3560 Training loss: 2.5804 2.3731 sec/batch
Epoch 2/20  Iteration 236/3560 Training loss: 2.5788 1.9474 sec/batch
Epoch 2/20  Iteration 237/3560 Training loss: 2.5773 1.9788 sec/batch
Epoch 2/20  Iteration 238/3560 Training loss: 2.5759 1.9844 sec/batch
Epoch 2/20  Iteration 239/3560 Training loss: 2.5743 1.8840 sec/batch
Epoch 2/20  Iteration 240/3560 Training loss: 2.5731 1.8781 sec/batch
Epoch 2/20  Iteration 241/3560 Training loss: 2.5721 1.8726 sec/batch
Epoch 2/20  Iteration 242/3560 Training loss: 2.5705 2.0457 sec/batch
Epoch 2/20  Iteration 243/3560 Training loss: 2.5688 2.1155 sec/batch
Epoch 2/20  Iteration 244/3560 Training loss: 2.5677 1.8812 sec/batch
Epoch 2/20  Iteration 245/3560 Training loss: 2.5665 1.8717 sec/batch
Epoch 2/20  Iteration 246/3560 Training loss: 2.5647 1.8976 sec/batch
Epoch 2/20  Iteration 247/3560 Training loss: 2.5630 1.9214 sec/batch
Epoch 2/20  Iteration 248/3560 Training loss: 2.5619 1.8846 sec/batch
Epoch 2/20  Iteration 249/3560 Training loss: 2.5606 1.9338 sec/batch
Epoch 2/20  Iteration 250/3560 Training loss: 2.5594 1.8756 sec/batch
Epoch 2/20  Iteration 251/3560 Training loss: 2.5581 1.9045 sec/batch
Epoch 2/20  Iteration 252/3560 Training loss: 2.5565 1.9705 sec/batch
Epoch 2/20  Iteration 253/3560 Training loss: 2.5553 1.9167 sec/batch
Epoch 2/20  Iteration 254/3560 Training loss: 2.5546 1.9362 sec/batch
Epoch 2/20  Iteration 255/3560 Training loss: 2.5533 1.9056 sec/batch
Epoch 2/20  Iteration 256/3560 Training loss: 2.5522 1.8529 sec/batch
Epoch 2/20  Iteration 257/3560 Training loss: 2.5508 2.0212 sec/batch
Epoch 2/20  Iteration 258/3560 Training loss: 2.5496 2.0088 sec/batch
Epoch 2/20  Iteration 259/3560 Training loss: 2.5483 1.8945 sec/batch
Epoch 2/20  Iteration 260/3560 Training loss: 2.5472 1.8884 sec/batch
Epoch 2/20  Iteration 261/3560 Training loss: 2.5459 1.9544 sec/batch
Epoch 2/20  Iteration 262/3560 Training loss: 2.5445 2.1846 sec/batch
Epoch 2/20  Iteration 263/3560 Training loss: 2.5429 1.9921 sec/batch
Epoch 2/20  Iteration 264/3560 Training loss: 2.5416 1.9270 sec/batch
Epoch 2/20  Iteration 265/3560 Training loss: 2.5404 2.3037 sec/batch
Epoch 2/20  Iteration 266/3560 Training loss: 2.5392 2.1066 sec/batch
Epoch 2/20  Iteration 267/3560 Training loss: 2.5379 1.9637 sec/batch
Epoch 2/20  Iteration 268/3560 Training loss: 2.5368 2.0815 sec/batch
Epoch 2/20  Iteration 269/3560 Training loss: 2.5356 1.9629 sec/batch
Epoch 2/20  Iteration 270/3560 Training loss: 2.5346 1.9254 sec/batch
Epoch 2/20  Iteration 271/3560 Training loss: 2.5334 2.1411 sec/batch
Epoch 2/20  Iteration 272/3560 Training loss: 2.5321 2.0538 sec/batch
Epoch 2/20  Iteration 273/3560 Training loss: 2.5308 1.9697 sec/batch
Epoch 2/20  Iteration 274/3560 Training loss: 2.5296 1.9137 sec/batch
Epoch 2/20  Iteration 275/3560 Training loss: 2.5286 2.1079 sec/batch
Epoch 2/20  Iteration 276/3560 Training loss: 2.5274 2.2788 sec/batch
Epoch 2/20  Iteration 277/3560 Training loss: 2.5263 2.4320 sec/batch
Epoch 2/20  Iteration 278/3560 Training loss: 2.5250 2.1797 sec/batch
Epoch 2/20  Iteration 279/3560 Training loss: 2.5241 2.1023 sec/batch
Epoch 2/20  Iteration 280/3560 Training loss: 2.5230 2.2772 sec/batch
Epoch 2/20  Iteration 281/3560 Training loss: 2.5217 2.0346 sec/batch
Epoch 2/20  Iteration 282/3560 Training loss: 2.5204 1.9384 sec/batch
Epoch 2/20  Iteration 283/3560 Training loss: 2.5193 1.8501 sec/batch
Epoch 2/20  Iteration 284/3560 Training loss: 2.5182 1.8884 sec/batch
Epoch 2/20  Iteration 285/3560 Training loss: 2.5170 2.0589 sec/batch
Epoch 2/20  Iteration 286/3560 Training loss: 2.5161 2.1503 sec/batch
Epoch 2/20  Iteration 287/3560 Training loss: 2.5152 2.1536 sec/batch
Epoch 2/20  Iteration 288/3560 Training loss: 2.5139 1.9176 sec/batch
Epoch 2/20  Iteration 289/3560 Training loss: 2.5130 1.8986 sec/batch
Epoch 2/20  Iteration 290/3560 Training loss: 2.5121 1.9555 sec/batch
Epoch 2/20  Iteration 291/3560 Training loss: 2.5110 1.9418 sec/batch
Epoch 2/20  Iteration 292/3560 Training loss: 2.5099 1.9191 sec/batch
Epoch 2/20  Iteration 293/3560 Training loss: 2.5088 1.9776 sec/batch
Epoch 2/20  Iteration 294/3560 Training loss: 2.5076 2.1982 sec/batch
Epoch 2/20  Iteration 295/3560 Training loss: 2.5066 2.2164 sec/batch
Epoch 2/20  Iteration 296/3560 Training loss: 2.5057 1.9666 sec/batch
Epoch 2/20  Iteration 297/3560 Training loss: 2.5048 1.9125 sec/batch
Epoch 2/20  Iteration 298/3560 Training loss: 2.5038 2.0188 sec/batch
Epoch 2/20  Iteration 299/3560 Training loss: 2.5030 2.1600 sec/batch
Epoch 2/20  Iteration 300/3560 Training loss: 2.5020 2.4876 sec/batch
Epoch 2/20  Iteration 301/3560 Training loss: 2.5009 2.6283 sec/batch
Epoch 2/20  Iteration 302/3560 Training loss: 2.5001 2.0448 sec/batch
Epoch 2/20  Iteration 303/3560 Training loss: 2.4991 2.1428 sec/batch
Epoch 2/20  Iteration 304/3560 Training loss: 2.4979 1.9663 sec/batch
Epoch 2/20  Iteration 305/3560 Training loss: 2.4970 1.9839 sec/batch
Epoch 2/20  Iteration 306/3560 Training loss: 2.4962 2.0421 sec/batch
Epoch 2/20  Iteration 307/3560 Training loss: 2.4953 2.0667 sec/batch
Epoch 2/20  Iteration 308/3560 Training loss: 2.4943 2.0875 sec/batch
Epoch 2/20  Iteration 309/3560 Training loss: 2.4933 1.9135 sec/batch
Epoch 2/20  Iteration 310/3560 Training loss: 2.4921 2.0800 sec/batch
Epoch 2/20  Iteration 311/3560 Training loss: 2.4913 1.9343 sec/batch
Epoch 2/20  Iteration 312/3560 Training loss: 2.4904 2.0274 sec/batch
Epoch 2/20  Iteration 313/3560 Training loss: 2.4894 1.9516 sec/batch
Epoch 2/20  Iteration 314/3560 Training loss: 2.4884 2.0789 sec/batch
Epoch 2/20  Iteration 315/3560 Training loss: 2.4873 2.0865 sec/batch
Epoch 2/20  Iteration 316/3560 Training loss: 2.4864 2.1287 sec/batch
Epoch 2/20  Iteration 317/3560 Training loss: 2.4857 2.3738 sec/batch
Epoch 2/20  Iteration 318/3560 Training loss: 2.4847 2.1390 sec/batch
Epoch 2/20  Iteration 319/3560 Training loss: 2.4839 2.0644 sec/batch
Epoch 2/20  Iteration 320/3560 Training loss: 2.4828 2.0121 sec/batch
Epoch 2/20  Iteration 321/3560 Training loss: 2.4820 2.3107 sec/batch
Epoch 2/20  Iteration 322/3560 Training loss: 2.4810 2.1400 sec/batch
Epoch 2/20  Iteration 323/3560 Training loss: 2.4801 2.0897 sec/batch
Epoch 2/20  Iteration 324/3560 Training loss: 2.4795 1.9020 sec/batch
Epoch 2/20  Iteration 325/3560 Training loss: 2.4785 2.0925 sec/batch
Epoch 2/20  Iteration 326/3560 Training loss: 2.4778 2.1998 sec/batch
Epoch 2/20  Iteration 327/3560 Training loss: 2.4769 2.0775 sec/batch
Epoch 2/20  Iteration 328/3560 Training loss: 2.4759 1.9353 sec/batch
Epoch 2/20  Iteration 329/3560 Training loss: 2.4752 2.0332 sec/batch
Epoch 2/20  Iteration 330/3560 Training loss: 2.4746 2.3532 sec/batch
Epoch 2/20  Iteration 331/3560 Training loss: 2.4738 2.5458 sec/batch
Epoch 2/20  Iteration 332/3560 Training loss: 2.4731 2.1016 sec/batch
Epoch 2/20  Iteration 333/3560 Training loss: 2.4721 1.9916 sec/batch
Epoch 2/20  Iteration 334/3560 Training loss: 2.4713 2.1108 sec/batch
Epoch 2/20  Iteration 335/3560 Training loss: 2.4704 2.3752 sec/batch
Epoch 2/20  Iteration 336/3560 Training loss: 2.4695 2.0572 sec/batch
Epoch 2/20  Iteration 337/3560 Training loss: 2.4685 2.1407 sec/batch
Epoch 2/20  Iteration 338/3560 Training loss: 2.4678 2.0517 sec/batch
Epoch 2/20  Iteration 339/3560 Training loss: 2.4669 2.1984 sec/batch
Epoch 2/20  Iteration 340/3560 Training loss: 2.4660 1.9981 sec/batch
Epoch 2/20  Iteration 341/3560 Training loss: 2.4650 2.2284 sec/batch
Epoch 2/20  Iteration 342/3560 Training loss: 2.4641 2.2538 sec/batch
Epoch 2/20  Iteration 343/3560 Training loss: 2.4633 1.9273 sec/batch
Epoch 2/20  Iteration 344/3560 Training loss: 2.4625 2.0603 sec/batch
Epoch 2/20  Iteration 345/3560 Training loss: 2.4617 2.2171 sec/batch
Epoch 2/20  Iteration 346/3560 Training loss: 2.4609 2.5529 sec/batch
Epoch 2/20  Iteration 347/3560 Training loss: 2.4601 2.1614 sec/batch
Epoch 2/20  Iteration 348/3560 Training loss: 2.4591 2.2442 sec/batch
Epoch 2/20  Iteration 349/3560 Training loss: 2.4582 2.2332 sec/batch
Epoch 2/20  Iteration 350/3560 Training loss: 2.4574 2.3222 sec/batch
Epoch 2/20  Iteration 351/3560 Training loss: 2.4566 2.1394 sec/batch
Epoch 2/20  Iteration 352/3560 Training loss: 2.4560 2.2931 sec/batch
Epoch 2/20  Iteration 353/3560 Training loss: 2.4553 2.2210 sec/batch
Epoch 2/20  Iteration 354/3560 Training loss: 2.4545 2.1111 sec/batch
Epoch 2/20  Iteration 355/3560 Training loss: 2.4537 2.7159 sec/batch
Epoch 2/20  Iteration 356/3560 Training loss: 2.4528 2.6416 sec/batch
Epoch 3/20  Iteration 357/3560 Training loss: 2.3738 2.5033 sec/batch
Epoch 3/20  Iteration 358/3560 Training loss: 2.3195 2.2558 sec/batch
Epoch 3/20  Iteration 359/3560 Training loss: 2.3054 2.0784 sec/batch
Epoch 3/20  Iteration 360/3560 Training loss: 2.3023 1.9426 sec/batch
Epoch 3/20  Iteration 361/3560 Training loss: 2.3002 2.0791 sec/batch
Epoch 3/20  Iteration 362/3560 Training loss: 2.2979 2.1101 sec/batch
Epoch 3/20  Iteration 363/3560 Training loss: 2.2975 2.1331 sec/batch
Epoch 3/20  Iteration 364/3560 Training loss: 2.2995 1.9643 sec/batch
Epoch 3/20  Iteration 365/3560 Training loss: 2.3006 2.2120 sec/batch
Epoch 3/20  Iteration 366/3560 Training loss: 2.2997 2.0090 sec/batch
Epoch 3/20  Iteration 367/3560 Training loss: 2.2962 1.8808 sec/batch
Epoch 3/20  Iteration 368/3560 Training loss: 2.2951 1.8525 sec/batch
Epoch 3/20  Iteration 369/3560 Training loss: 2.2945 2.2060 sec/batch
Epoch 3/20  Iteration 370/3560 Training loss: 2.2964 2.3018 sec/batch
Epoch 3/20  Iteration 371/3560 Training loss: 2.2956 1.9458 sec/batch
Epoch 3/20  Iteration 372/3560 Training loss: 2.2953 1.8450 sec/batch
Epoch 3/20  Iteration 373/3560 Training loss: 2.2945 2.0766 sec/batch
Epoch 3/20  Iteration 374/3560 Training loss: 2.2960 2.1281 sec/batch
Epoch 3/20  Iteration 375/3560 Training loss: 2.2959 2.1885 sec/batch
Epoch 3/20  Iteration 376/3560 Training loss: 2.2940 2.8246 sec/batch
Epoch 3/20  Iteration 377/3560 Training loss: 2.2922 2.3513 sec/batch
Epoch 3/20  Iteration 378/3560 Training loss: 2.2928 2.7590 sec/batch
Epoch 3/20  Iteration 379/3560 Training loss: 2.2920 2.4736 sec/batch
Epoch 3/20  Iteration 380/3560 Training loss: 2.2906 2.0806 sec/batch
Epoch 3/20  Iteration 381/3560 Training loss: 2.2892 2.0115 sec/batch
Epoch 3/20  Iteration 382/3560 Training loss: 2.2882 2.3743 sec/batch
Epoch 3/20  Iteration 383/3560 Training loss: 2.2868 2.0131 sec/batch
Epoch 3/20  Iteration 384/3560 Training loss: 2.2862 2.1221 sec/batch
Epoch 3/20  Iteration 385/3560 Training loss: 2.2863 1.9534 sec/batch
Epoch 3/20  Iteration 386/3560 Training loss: 2.2857 1.9869 sec/batch
Epoch 3/20  Iteration 387/3560 Training loss: 2.2856 2.0390 sec/batch
Epoch 3/20  Iteration 388/3560 Training loss: 2.2843 1.9551 sec/batch
Epoch 3/20  Iteration 389/3560 Training loss: 2.2833 2.1872 sec/batch
Epoch 3/20  Iteration 390/3560 Training loss: 2.2832 1.9960 sec/batch
Epoch 3/20  Iteration 391/3560 Training loss: 2.2823 2.0675 sec/batch
Epoch 3/20  Iteration 392/3560 Training loss: 2.2818 2.1247 sec/batch
Epoch 3/20  Iteration 393/3560 Training loss: 2.2810 2.5614 sec/batch
Epoch 3/20  Iteration 394/3560 Training loss: 2.2796 2.7380 sec/batch
Epoch 3/20  Iteration 395/3560 Training loss: 2.2781 2.5281 sec/batch
Epoch 3/20  Iteration 396/3560 Training loss: 2.2768 2.5180 sec/batch
Epoch 3/20  Iteration 397/3560 Training loss: 2.2756 2.0487 sec/batch
Epoch 3/20  Iteration 398/3560 Training loss: 2.2744 2.4232 sec/batch
Epoch 3/20  Iteration 399/3560 Training loss: 2.2732 2.0287 sec/batch
Epoch 3/20  Iteration 400/3560 Training loss: 2.2719 2.1752 sec/batch
Epoch 3/20  Iteration 401/3560 Training loss: 2.2711 2.0928 sec/batch
Epoch 3/20  Iteration 402/3560 Training loss: 2.2695 2.5739 sec/batch
Epoch 3/20  Iteration 403/3560 Training loss: 2.2691 2.9121 sec/batch
Epoch 3/20  Iteration 404/3560 Training loss: 2.2682 2.6402 sec/batch
Epoch 3/20  Iteration 405/3560 Training loss: 2.2674 2.0998 sec/batch
Epoch 3/20  Iteration 406/3560 Training loss: 2.2671 2.0169 sec/batch
Epoch 3/20  Iteration 407/3560 Training loss: 2.2660 1.9368 sec/batch
Epoch 3/20  Iteration 408/3560 Training loss: 2.2658 2.0244 sec/batch
Epoch 3/20  Iteration 409/3560 Training loss: 2.2649 1.9831 sec/batch
Epoch 3/20  Iteration 410/3560 Training loss: 2.2640 1.8821 sec/batch
Epoch 3/20  Iteration 411/3560 Training loss: 2.2631 2.3171 sec/batch
Epoch 3/20  Iteration 412/3560 Training loss: 2.2626 1.8934 sec/batch
Epoch 3/20  Iteration 413/3560 Training loss: 2.2620 1.8719 sec/batch
Epoch 3/20  Iteration 414/3560 Training loss: 2.2612 2.1945 sec/batch
Epoch 3/20  Iteration 415/3560 Training loss: 2.2603 2.1250 sec/batch
Epoch 3/20  Iteration 416/3560 Training loss: 2.2599 2.1379 sec/batch
Epoch 3/20  Iteration 417/3560 Training loss: 2.2592 2.3440 sec/batch
Epoch 3/20  Iteration 418/3560 Training loss: 2.2589 2.0300 sec/batch
Epoch 3/20  Iteration 419/3560 Training loss: 2.2587 2.2938 sec/batch
Epoch 3/20  Iteration 420/3560 Training loss: 2.2580 2.1966 sec/batch
Epoch 3/20  Iteration 421/3560 Training loss: 2.2570 2.4715 sec/batch
Epoch 3/20  Iteration 422/3560 Training loss: 2.2567 2.3903 sec/batch
Epoch 3/20  Iteration 423/3560 Training loss: 2.2561 2.1405 sec/batch
Epoch 3/20  Iteration 424/3560 Training loss: 2.2550 2.0259 sec/batch
Epoch 3/20  Iteration 425/3560 Training loss: 2.2541 1.8739 sec/batch
Epoch 3/20  Iteration 426/3560 Training loss: 2.2536 2.0243 sec/batch
Epoch 3/20  Iteration 427/3560 Training loss: 2.2534 2.0502 sec/batch
Epoch 3/20  Iteration 428/3560 Training loss: 2.2529 1.9161 sec/batch
Epoch 3/20  Iteration 429/3560 Training loss: 2.2525 2.3140 sec/batch
Epoch 3/20  Iteration 430/3560 Training loss: 2.2518 2.4828 sec/batch
Epoch 3/20  Iteration 431/3560 Training loss: 2.2511 2.0831 sec/batch
Epoch 3/20  Iteration 432/3560 Training loss: 2.2510 2.6098 sec/batch
Epoch 3/20  Iteration 433/3560 Training loss: 2.2503 1.8460 sec/batch
Epoch 3/20  Iteration 434/3560 Training loss: 2.2498 1.9575 sec/batch
Epoch 3/20  Iteration 435/3560 Training loss: 2.2489 2.6590 sec/batch
Epoch 3/20  Iteration 436/3560 Training loss: 2.2482 2.1286 sec/batch
Epoch 3/20  Iteration 437/3560 Training loss: 2.2474 2.7642 sec/batch
Epoch 3/20  Iteration 438/3560 Training loss: 2.2468 4.4553 sec/batch
Epoch 3/20  Iteration 439/3560 Training loss: 2.2460 3.5886 sec/batch
Epoch 3/20  Iteration 440/3560 Training loss: 2.2452 3.0093 sec/batch
Epoch 3/20  Iteration 441/3560 Training loss: 2.2442 3.4988 sec/batch
Epoch 3/20  Iteration 442/3560 Training loss: 2.2433 2.3981 sec/batch
Epoch 3/20  Iteration 443/3560 Training loss: 2.2428 2.5792 sec/batch
Epoch 3/20  Iteration 444/3560 Training loss: 2.2420 2.2650 sec/batch
Epoch 3/20  Iteration 445/3560 Training loss: 2.2411 1.9303 sec/batch
Epoch 3/20  Iteration 446/3560 Training loss: 2.2406 2.1299 sec/batch
Epoch 3/20  Iteration 447/3560 Training loss: 2.2399 2.1144 sec/batch
Epoch 3/20  Iteration 448/3560 Training loss: 2.2394 2.0170 sec/batch
Epoch 3/20  Iteration 449/3560 Training loss: 2.2385 2.0119 sec/batch
Epoch 3/20  Iteration 450/3560 Training loss: 2.2376 1.8850 sec/batch
Epoch 3/20  Iteration 451/3560 Training loss: 2.2367 2.3514 sec/batch
Epoch 3/20  Iteration 452/3560 Training loss: 2.2360 2.0993 sec/batch
Epoch 3/20  Iteration 453/3560 Training loss: 2.2353 1.9498 sec/batch
Epoch 3/20  Iteration 454/3560 Training loss: 2.2345 2.0643 sec/batch
Epoch 3/20  Iteration 455/3560 Training loss: 2.2337 2.0262 sec/batch
Epoch 3/20  Iteration 456/3560 Training loss: 2.2328 2.0664 sec/batch
Epoch 3/20  Iteration 457/3560 Training loss: 2.2324 2.3092 sec/batch
Epoch 3/20  Iteration 458/3560 Training loss: 2.2319 2.2471 sec/batch
Epoch 3/20  Iteration 459/3560 Training loss: 2.2312 2.1074 sec/batch
Epoch 3/20  Iteration 460/3560 Training loss: 2.2304 2.4024 sec/batch
Epoch 3/20  Iteration 461/3560 Training loss: 2.2296 2.0529 sec/batch
Epoch 3/20  Iteration 462/3560 Training loss: 2.2291 2.0752 sec/batch
Epoch 3/20  Iteration 463/3560 Training loss: 2.2284 2.2078 sec/batch
Epoch 3/20  Iteration 464/3560 Training loss: 2.2279 2.1532 sec/batch
Epoch 3/20  Iteration 465/3560 Training loss: 2.2273 2.2192 sec/batch
Epoch 3/20  Iteration 466/3560 Training loss: 2.2265 2.3591 sec/batch
Epoch 3/20  Iteration 467/3560 Training loss: 2.2260 2.0014 sec/batch
Epoch 3/20  Iteration 468/3560 Training loss: 2.2255 2.0112 sec/batch
Epoch 3/20  Iteration 469/3560 Training loss: 2.2250 2.2116 sec/batch
Epoch 3/20  Iteration 470/3560 Training loss: 2.2244 2.0270 sec/batch
Epoch 3/20  Iteration 471/3560 Training loss: 2.2238 2.0247 sec/batch
Epoch 3/20  Iteration 472/3560 Training loss: 2.2229 2.2661 sec/batch
Epoch 3/20  Iteration 473/3560 Training loss: 2.2224 2.5220 sec/batch
Epoch 3/20  Iteration 474/3560 Training loss: 2.2218 2.5862 sec/batch
Epoch 3/20  Iteration 475/3560 Training loss: 2.2214 2.3301 sec/batch
Epoch 3/20  Iteration 476/3560 Training loss: 2.2208 2.1482 sec/batch
Epoch 3/20  Iteration 477/3560 Training loss: 2.2204 2.4968 sec/batch
Epoch 3/20  Iteration 478/3560 Training loss: 2.2197 1.9743 sec/batch
Epoch 3/20  Iteration 479/3560 Training loss: 2.2191 1.8638 sec/batch
Epoch 3/20  Iteration 480/3560 Training loss: 2.2187 2.1563 sec/batch
Epoch 3/20  Iteration 481/3560 Training loss: 2.2181 2.0614 sec/batch
Epoch 3/20  Iteration 482/3560 Training loss: 2.2173 2.1764 sec/batch
Epoch 3/20  Iteration 483/3560 Training loss: 2.2169 2.2032 sec/batch
Epoch 3/20  Iteration 484/3560 Training loss: 2.2165 2.3278 sec/batch
Epoch 3/20  Iteration 485/3560 Training loss: 2.2161 1.9662 sec/batch
Epoch 3/20  Iteration 486/3560 Training loss: 2.2156 2.1519 sec/batch
Epoch 3/20  Iteration 487/3560 Training loss: 2.2150 2.3382 sec/batch
Epoch 3/20  Iteration 488/3560 Training loss: 2.2143 2.3835 sec/batch
Epoch 3/20  Iteration 489/3560 Training loss: 2.2139 2.2682 sec/batch
Epoch 3/20  Iteration 490/3560 Training loss: 2.2135 1.9465 sec/batch
Epoch 3/20  Iteration 491/3560 Training loss: 2.2128 2.4498 sec/batch
Epoch 3/20  Iteration 492/3560 Training loss: 2.2124 2.3588 sec/batch
Epoch 3/20  Iteration 493/3560 Training loss: 2.2119 2.4701 sec/batch
Epoch 3/20  Iteration 494/3560 Training loss: 2.2113 2.1909 sec/batch
Epoch 3/20  Iteration 495/3560 Training loss: 2.2110 2.5817 sec/batch
Epoch 3/20  Iteration 496/3560 Training loss: 2.2105 2.3163 sec/batch
Epoch 3/20  Iteration 497/3560 Training loss: 2.2101 2.5091 sec/batch
Epoch 3/20  Iteration 498/3560 Training loss: 2.2096 2.2481 sec/batch
Epoch 3/20  Iteration 499/3560 Training loss: 2.2090 2.1526 sec/batch
Epoch 3/20  Iteration 500/3560 Training loss: 2.2085 1.8819 sec/batch
Epoch 3/20  Iteration 501/3560 Training loss: 2.2080 2.2673 sec/batch
Epoch 3/20  Iteration 502/3560 Training loss: 2.2076 2.0736 sec/batch
Epoch 3/20  Iteration 503/3560 Training loss: 2.2071 1.9591 sec/batch
Epoch 3/20  Iteration 504/3560 Training loss: 2.2067 1.8577 sec/batch
Epoch 3/20  Iteration 505/3560 Training loss: 2.2063 1.8545 sec/batch
Epoch 3/20  Iteration 506/3560 Training loss: 2.2057 2.3447 sec/batch
Epoch 3/20  Iteration 507/3560 Training loss: 2.2052 2.1858 sec/batch
Epoch 3/20  Iteration 508/3560 Training loss: 2.2050 1.8433 sec/batch
Epoch 3/20  Iteration 509/3560 Training loss: 2.2045 1.8346 sec/batch
Epoch 3/20  Iteration 510/3560 Training loss: 2.2040 1.9242 sec/batch
Epoch 3/20  Iteration 511/3560 Training loss: 2.2035 1.9227 sec/batch
Epoch 3/20  Iteration 512/3560 Training loss: 2.2029 1.9559 sec/batch
Epoch 3/20  Iteration 513/3560 Training loss: 2.2024 2.2574 sec/batch
Epoch 3/20  Iteration 514/3560 Training loss: 2.2018 1.9258 sec/batch
Epoch 3/20  Iteration 515/3560 Training loss: 2.2012 1.9738 sec/batch
Epoch 3/20  Iteration 516/3560 Training loss: 2.2009 1.8964 sec/batch
Epoch 3/20  Iteration 517/3560 Training loss: 2.2005 2.1563 sec/batch
Epoch 3/20  Iteration 518/3560 Training loss: 2.2000 1.8487 sec/batch
Epoch 3/20  Iteration 519/3560 Training loss: 2.1995 1.8467 sec/batch
Epoch 3/20  Iteration 520/3560 Training loss: 2.1989 1.8460 sec/batch
Epoch 3/20  Iteration 521/3560 Training loss: 2.1985 1.8593 sec/batch
Epoch 3/20  Iteration 522/3560 Training loss: 2.1980 1.8269 sec/batch
Epoch 3/20  Iteration 523/3560 Training loss: 2.1976 1.8201 sec/batch
Epoch 3/20  Iteration 524/3560 Training loss: 2.1972 1.8687 sec/batch
Epoch 3/20  Iteration 525/3560 Training loss: 2.1966 1.8062 sec/batch
Epoch 3/20  Iteration 526/3560 Training loss: 2.1962 1.8443 sec/batch
Epoch 3/20  Iteration 527/3560 Training loss: 2.1956 2.1398 sec/batch
Epoch 3/20  Iteration 528/3560 Training loss: 2.1951 1.8990 sec/batch
Epoch 3/20  Iteration 529/3560 Training loss: 2.1947 1.8481 sec/batch
Epoch 3/20  Iteration 530/3560 Training loss: 2.1943 2.0967 sec/batch
Epoch 3/20  Iteration 531/3560 Training loss: 2.1939 1.8379 sec/batch
Epoch 3/20  Iteration 532/3560 Training loss: 2.1934 1.9620 sec/batch
Epoch 3/20  Iteration 533/3560 Training loss: 2.1928 1.9083 sec/batch
Epoch 3/20  Iteration 534/3560 Training loss: 2.1923 1.8493 sec/batch
Epoch 4/20  Iteration 535/3560 Training loss: 2.1863 1.8370 sec/batch
Epoch 4/20  Iteration 536/3560 Training loss: 2.1296 2.0809 sec/batch
Epoch 4/20  Iteration 537/3560 Training loss: 2.1135 2.2369 sec/batch
Epoch 4/20  Iteration 538/3560 Training loss: 2.1057 1.9592 sec/batch
Epoch 4/20  Iteration 539/3560 Training loss: 2.1043 1.8673 sec/batch
Epoch 4/20  Iteration 540/3560 Training loss: 2.0977 1.8269 sec/batch
Epoch 4/20  Iteration 541/3560 Training loss: 2.0983 1.8374 sec/batch
Epoch 4/20  Iteration 542/3560 Training loss: 2.0977 1.8246 sec/batch
Epoch 4/20  Iteration 543/3560 Training loss: 2.0996 2.0301 sec/batch
Epoch 4/20  Iteration 544/3560 Training loss: 2.0987 1.8801 sec/batch
Epoch 4/20  Iteration 545/3560 Training loss: 2.0957 1.8450 sec/batch
Epoch 4/20  Iteration 546/3560 Training loss: 2.0940 1.8494 sec/batch
Epoch 4/20  Iteration 547/3560 Training loss: 2.0936 1.8299 sec/batch
Epoch 4/20  Iteration 548/3560 Training loss: 2.0954 1.8524 sec/batch
Epoch 4/20  Iteration 549/3560 Training loss: 2.0944 1.7848 sec/batch
Epoch 4/20  Iteration 550/3560 Training loss: 2.0935 1.8217 sec/batch
Epoch 4/20  Iteration 551/3560 Training loss: 2.0925 1.9853 sec/batch
Epoch 4/20  Iteration 552/3560 Training loss: 2.0942 1.8827 sec/batch
Epoch 4/20  Iteration 553/3560 Training loss: 2.0935 1.8562 sec/batch
Epoch 4/20  Iteration 554/3560 Training loss: 2.0934 1.9579 sec/batch
Epoch 4/20  Iteration 555/3560 Training loss: 2.0920 1.9126 sec/batch
Epoch 4/20  Iteration 556/3560 Training loss: 2.0931 1.8767 sec/batch
Epoch 4/20  Iteration 557/3560 Training loss: 2.0922 1.9038 sec/batch
Epoch 4/20  Iteration 558/3560 Training loss: 2.0918 1.9368 sec/batch
Epoch 4/20  Iteration 559/3560 Training loss: 2.0915 2.1738 sec/batch
Epoch 4/20  Iteration 560/3560 Training loss: 2.0902 2.1511 sec/batch
Epoch 4/20  Iteration 561/3560 Training loss: 2.0893 2.1852 sec/batch
Epoch 4/20  Iteration 562/3560 Training loss: 2.0892 2.0975 sec/batch
Epoch 4/20  Iteration 563/3560 Training loss: 2.0902 1.8707 sec/batch
Epoch 4/20  Iteration 564/3560 Training loss: 2.0902 2.0116 sec/batch
Epoch 4/20  Iteration 565/3560 Training loss: 2.0899 1.9486 sec/batch
Epoch 4/20  Iteration 566/3560 Training loss: 2.0888 1.8425 sec/batch
Epoch 4/20  Iteration 567/3560 Training loss: 2.0882 1.8044 sec/batch
Epoch 4/20  Iteration 568/3560 Training loss: 2.0885 2.3458 sec/batch
Epoch 4/20  Iteration 569/3560 Training loss: 2.0877 2.2098 sec/batch
Epoch 4/20  Iteration 570/3560 Training loss: 2.0874 2.0150 sec/batch
Epoch 4/20  Iteration 571/3560 Training loss: 2.0867 2.0194 sec/batch
Epoch 4/20  Iteration 572/3560 Training loss: 2.0853 1.8800 sec/batch
Epoch 4/20  Iteration 573/3560 Training loss: 2.0840 2.1645 sec/batch
Epoch 4/20  Iteration 574/3560 Training loss: 2.0831 2.4344 sec/batch
Epoch 4/20  Iteration 575/3560 Training loss: 2.0823 2.2939 sec/batch
Epoch 4/20  Iteration 576/3560 Training loss: 2.0819 2.2354 sec/batch
Epoch 4/20  Iteration 577/3560 Training loss: 2.0810 2.1895 sec/batch
Epoch 4/20  Iteration 578/3560 Training loss: 2.0800 2.2868 sec/batch
Epoch 4/20  Iteration 579/3560 Training loss: 2.0795 2.1164 sec/batch
Epoch 4/20  Iteration 580/3560 Training loss: 2.0780 2.3106 sec/batch
Epoch 4/20  Iteration 581/3560 Training loss: 2.0780 1.9170 sec/batch
Epoch 4/20  Iteration 582/3560 Training loss: 2.0772 2.0496 sec/batch
Epoch 4/20  Iteration 583/3560 Training loss: 2.0768 2.2324 sec/batch
Epoch 4/20  Iteration 584/3560 Training loss: 2.0771 2.0088 sec/batch
Epoch 4/20  Iteration 585/3560 Training loss: 2.0761 2.2142 sec/batch
Epoch 4/20  Iteration 586/3560 Training loss: 2.0765 1.9320 sec/batch
Epoch 4/20  Iteration 587/3560 Training loss: 2.0759 2.1569 sec/batch
Epoch 4/20  Iteration 588/3560 Training loss: 2.0753 2.2236 sec/batch
Epoch 4/20  Iteration 589/3560 Training loss: 2.0747 2.1806 sec/batch
Epoch 4/20  Iteration 590/3560 Training loss: 2.0745 2.0498 sec/batch
Epoch 4/20  Iteration 591/3560 Training loss: 2.0744 1.9797 sec/batch
Epoch 4/20  Iteration 592/3560 Training loss: 2.0738 2.1376 sec/batch
Epoch 4/20  Iteration 593/3560 Training loss: 2.0732 2.7547 sec/batch
Epoch 4/20  Iteration 594/3560 Training loss: 2.0732 2.3336 sec/batch
Epoch 4/20  Iteration 595/3560 Training loss: 2.0726 2.0776 sec/batch
Epoch 4/20  Iteration 596/3560 Training loss: 2.0729 2.3047 sec/batch
Epoch 4/20  Iteration 597/3560 Training loss: 2.0729 2.7876 sec/batch
Epoch 4/20  Iteration 598/3560 Training loss: 2.0727 2.4029 sec/batch
Epoch 4/20  Iteration 599/3560 Training loss: 2.0722 2.1969 sec/batch
Epoch 4/20  Iteration 600/3560 Training loss: 2.0721 1.9633 sec/batch
Epoch 4/20  Iteration 601/3560 Training loss: 2.0719 2.2528 sec/batch
Epoch 4/20  Iteration 602/3560 Training loss: 2.0709 1.9550 sec/batch
Epoch 4/20  Iteration 603/3560 Training loss: 2.0704 2.1164 sec/batch
Epoch 4/20  Iteration 604/3560 Training loss: 2.0700 2.0778 sec/batch
Epoch 4/20  Iteration 605/3560 Training loss: 2.0700 2.0696 sec/batch
Epoch 4/20  Iteration 606/3560 Training loss: 2.0699 1.9393 sec/batch
Epoch 4/20  Iteration 607/3560 Training loss: 2.0698 2.0017 sec/batch
Epoch 4/20  Iteration 608/3560 Training loss: 2.0693 2.1681 sec/batch
Epoch 4/20  Iteration 609/3560 Training loss: 2.0691 2.4741 sec/batch
Epoch 4/20  Iteration 610/3560 Training loss: 2.0692 2.5892 sec/batch
Epoch 4/20  Iteration 611/3560 Training loss: 2.0688 2.1625 sec/batch
Epoch 4/20  Iteration 612/3560 Training loss: 2.0685 2.3138 sec/batch
Epoch 4/20  Iteration 613/3560 Training loss: 2.0677 2.2134 sec/batch
Epoch 4/20  Iteration 614/3560 Training loss: 2.0673 2.5948 sec/batch
Epoch 4/20  Iteration 615/3560 Training loss: 2.0666 2.3459 sec/batch
Epoch 4/20  Iteration 616/3560 Training loss: 2.0663 2.1196 sec/batch
Epoch 4/20  Iteration 617/3560 Training loss: 2.0655 1.9006 sec/batch
Epoch 4/20  Iteration 618/3560 Training loss: 2.0650 1.9483 sec/batch
Epoch 4/20  Iteration 619/3560 Training loss: 2.0640 2.3518 sec/batch
Epoch 4/20  Iteration 620/3560 Training loss: 2.0634 2.2424 sec/batch
Epoch 4/20  Iteration 621/3560 Training loss: 2.0629 2.2117 sec/batch
Epoch 4/20  Iteration 622/3560 Training loss: 2.0624 2.0830 sec/batch
Epoch 4/20  Iteration 623/3560 Training loss: 2.0615 1.8983 sec/batch
Epoch 4/20  Iteration 624/3560 Training loss: 2.0613 1.8760 sec/batch
Epoch 4/20  Iteration 625/3560 Training loss: 2.0606 1.9145 sec/batch
Epoch 4/20  Iteration 626/3560 Training loss: 2.0603 1.8784 sec/batch
Epoch 4/20  Iteration 627/3560 Training loss: 2.0595 2.2665 sec/batch
Epoch 4/20  Iteration 628/3560 Training loss: 2.0589 2.3469 sec/batch
Epoch 4/20  Iteration 629/3560 Training loss: 2.0582 1.9973 sec/batch
Epoch 4/20  Iteration 630/3560 Training loss: 2.0576 1.8990 sec/batch
Epoch 4/20  Iteration 631/3560 Training loss: 2.0572 2.1429 sec/batch
Epoch 4/20  Iteration 632/3560 Training loss: 2.0565 2.3582 sec/batch
Epoch 4/20  Iteration 633/3560 Training loss: 2.0558 2.0164 sec/batch
Epoch 4/20  Iteration 634/3560 Training loss: 2.0551 2.1350 sec/batch
Epoch 4/20  Iteration 635/3560 Training loss: 2.0548 2.0624 sec/batch
Epoch 4/20  Iteration 636/3560 Training loss: 2.0545 1.8638 sec/batch
Epoch 4/20  Iteration 637/3560 Training loss: 2.0539 2.1222 sec/batch
Epoch 4/20  Iteration 638/3560 Training loss: 2.0533 2.0264 sec/batch
Epoch 4/20  Iteration 639/3560 Training loss: 2.0527 1.9447 sec/batch
Epoch 4/20  Iteration 640/3560 Training loss: 2.0523 1.9893 sec/batch
Epoch 4/20  Iteration 641/3560 Training loss: 2.0519 2.1720 sec/batch
Epoch 4/20  Iteration 642/3560 Training loss: 2.0516 1.9896 sec/batch
Epoch 4/20  Iteration 643/3560 Training loss: 2.0513 2.1362 sec/batch
Epoch 4/20  Iteration 644/3560 Training loss: 2.0509 2.1613 sec/batch
Epoch 4/20  Iteration 645/3560 Training loss: 2.0506 2.3617 sec/batch
Epoch 4/20  Iteration 646/3560 Training loss: 2.0502 2.1425 sec/batch
Epoch 4/20  Iteration 647/3560 Training loss: 2.0498 2.0933 sec/batch
Epoch 4/20  Iteration 648/3560 Training loss: 2.0494 2.0616 sec/batch
Epoch 4/20  Iteration 649/3560 Training loss: 2.0489 2.1379 sec/batch
Epoch 4/20  Iteration 650/3560 Training loss: 2.0483 2.4773 sec/batch
Epoch 4/20  Iteration 651/3560 Training loss: 2.0480 2.3151 sec/batch
Epoch 4/20  Iteration 652/3560 Training loss: 2.0475 2.1449 sec/batch
Epoch 4/20  Iteration 653/3560 Training loss: 2.0473 2.3759 sec/batch
Epoch 4/20  Iteration 654/3560 Training loss: 2.0468 2.3825 sec/batch
Epoch 4/20  Iteration 655/3560 Training loss: 2.0465 2.0901 sec/batch
Epoch 4/20  Iteration 656/3560 Training loss: 2.0460 2.2861 sec/batch
Epoch 4/20  Iteration 657/3560 Training loss: 2.0456 2.3974 sec/batch
Epoch 4/20  Iteration 658/3560 Training loss: 2.0454 1.9738 sec/batch
Epoch 4/20  Iteration 659/3560 Training loss: 2.0451 2.2795 sec/batch
Epoch 4/20  Iteration 660/3560 Training loss: 2.0444 2.0759 sec/batch
Epoch 4/20  Iteration 661/3560 Training loss: 2.0442 2.1983 sec/batch
Epoch 4/20  Iteration 662/3560 Training loss: 2.0439 2.1883 sec/batch
Epoch 4/20  Iteration 663/3560 Training loss: 2.0436 2.0849 sec/batch
Epoch 4/20  Iteration 664/3560 Training loss: 2.0432 2.1943 sec/batch
Epoch 4/20  Iteration 665/3560 Training loss: 2.0427 2.0909 sec/batch
Epoch 4/20  Iteration 666/3560 Training loss: 2.0421 2.2893 sec/batch
Epoch 4/20  Iteration 667/3560 Training loss: 2.0418 2.3409 sec/batch
Epoch 4/20  Iteration 668/3560 Training loss: 2.0415 2.1634 sec/batch
Epoch 4/20  Iteration 669/3560 Training loss: 2.0411 1.9517 sec/batch
Epoch 4/20  Iteration 670/3560 Training loss: 2.0409 1.9400 sec/batch
Epoch 4/20  Iteration 671/3560 Training loss: 2.0405 1.8998 sec/batch
Epoch 4/20  Iteration 672/3560 Training loss: 2.0402 1.9863 sec/batch
Epoch 4/20  Iteration 673/3560 Training loss: 2.0400 1.8207 sec/batch
Epoch 4/20  Iteration 674/3560 Training loss: 2.0396 2.7653 sec/batch
Epoch 4/20  Iteration 675/3560 Training loss: 2.0394 2.4558 sec/batch
Epoch 4/20  Iteration 676/3560 Training loss: 2.0390 2.1428 sec/batch
Epoch 4/20  Iteration 677/3560 Training loss: 2.0387 2.0608 sec/batch
Epoch 4/20  Iteration 678/3560 Training loss: 2.0384 2.1149 sec/batch
Epoch 4/20  Iteration 679/3560 Training loss: 2.0379 2.0222 sec/batch
Epoch 4/20  Iteration 680/3560 Training loss: 2.0377 2.1472 sec/batch
Epoch 4/20  Iteration 681/3560 Training loss: 2.0374 2.2062 sec/batch
Epoch 4/20  Iteration 682/3560 Training loss: 2.0372 1.9905 sec/batch
Epoch 4/20  Iteration 683/3560 Training loss: 2.0369 2.1554 sec/batch
Epoch 4/20  Iteration 684/3560 Training loss: 2.0365 2.4491 sec/batch
Epoch 4/20  Iteration 685/3560 Training loss: 2.0361 2.2204 sec/batch
Epoch 4/20  Iteration 686/3560 Training loss: 2.0360 2.0211 sec/batch
Epoch 4/20  Iteration 687/3560 Training loss: 2.0358 1.8551 sec/batch
Epoch 4/20  Iteration 688/3560 Training loss: 2.0355 1.9369 sec/batch
Epoch 4/20  Iteration 689/3560 Training loss: 2.0351 1.9570 sec/batch
Epoch 4/20  Iteration 690/3560 Training loss: 2.0347 1.9999 sec/batch
Epoch 4/20  Iteration 691/3560 Training loss: 2.0344 2.0343 sec/batch
Epoch 4/20  Iteration 692/3560 Training loss: 2.0341 1.9759 sec/batch
Epoch 4/20  Iteration 693/3560 Training loss: 2.0336 2.0976 sec/batch
Epoch 4/20  Iteration 694/3560 Training loss: 2.0335 2.0795 sec/batch
Epoch 4/20  Iteration 695/3560 Training loss: 2.0333 2.0407 sec/batch
Epoch 4/20  Iteration 696/3560 Training loss: 2.0329 1.9213 sec/batch
Epoch 4/20  Iteration 697/3560 Training loss: 2.0327 2.0202 sec/batch
Epoch 4/20  Iteration 698/3560 Training loss: 2.0324 2.3358 sec/batch
Epoch 4/20  Iteration 699/3560 Training loss: 2.0321 2.2943 sec/batch
Epoch 4/20  Iteration 700/3560 Training loss: 2.0318 2.0637 sec/batch
Epoch 4/20  Iteration 701/3560 Training loss: 2.0315 2.1917 sec/batch
Epoch 4/20  Iteration 702/3560 Training loss: 2.0314 2.3278 sec/batch
Epoch 4/20  Iteration 703/3560 Training loss: 2.0310 2.4039 sec/batch
Epoch 4/20  Iteration 704/3560 Training loss: 2.0307 2.6825 sec/batch
Epoch 4/20  Iteration 705/3560 Training loss: 2.0303 2.3224 sec/batch
Epoch 4/20  Iteration 706/3560 Training loss: 2.0299 2.1024 sec/batch
Epoch 4/20  Iteration 707/3560 Training loss: 2.0297 2.0750 sec/batch
Epoch 4/20  Iteration 708/3560 Training loss: 2.0294 1.8751 sec/batch
Epoch 4/20  Iteration 709/3560 Training loss: 2.0292 2.0616 sec/batch
Epoch 4/20  Iteration 710/3560 Training loss: 2.0288 2.0297 sec/batch
Epoch 4/20  Iteration 711/3560 Training loss: 2.0285 2.0566 sec/batch
Epoch 4/20  Iteration 712/3560 Training loss: 2.0282 2.0343 sec/batch
Epoch 5/20  Iteration 713/3560 Training loss: 2.0462 1.9328 sec/batch
Epoch 5/20  Iteration 714/3560 Training loss: 1.9992 2.0310 sec/batch
Epoch 5/20  Iteration 715/3560 Training loss: 1.9857 2.0078 sec/batch
Epoch 5/20  Iteration 716/3560 Training loss: 1.9767 2.5239 sec/batch
Epoch 5/20  Iteration 717/3560 Training loss: 1.9731 2.0767 sec/batch
Epoch 5/20  Iteration 718/3560 Training loss: 1.9650 2.2003 sec/batch
Epoch 5/20  Iteration 719/3560 Training loss: 1.9660 1.9253 sec/batch
Epoch 5/20  Iteration 720/3560 Training loss: 1.9661 2.2678 sec/batch
Epoch 5/20  Iteration 721/3560 Training loss: 1.9693 2.1621 sec/batch
Epoch 5/20  Iteration 722/3560 Training loss: 1.9687 1.9013 sec/batch
Epoch 5/20  Iteration 723/3560 Training loss: 1.9655 2.0235 sec/batch
Epoch 5/20  Iteration 724/3560 Training loss: 1.9636 2.0321 sec/batch
Epoch 5/20  Iteration 725/3560 Training loss: 1.9633 2.2480 sec/batch
Epoch 5/20  Iteration 726/3560 Training loss: 1.9662 2.0734 sec/batch
Epoch 5/20  Iteration 727/3560 Training loss: 1.9655 2.2360 sec/batch
Epoch 5/20  Iteration 728/3560 Training loss: 1.9643 2.4118 sec/batch
Epoch 5/20  Iteration 729/3560 Training loss: 1.9639 2.4460 sec/batch
Epoch 5/20  Iteration 730/3560 Training loss: 1.9660 2.2532 sec/batch
Epoch 5/20  Iteration 731/3560 Training loss: 1.9660 2.2708 sec/batch
Epoch 5/20  Iteration 732/3560 Training loss: 1.9660 2.5937 sec/batch
Epoch 5/20  Iteration 733/3560 Training loss: 1.9650 2.4745 sec/batch
Epoch 5/20  Iteration 734/3560 Training loss: 1.9659 1.8926 sec/batch
Epoch 5/20  Iteration 735/3560 Training loss: 1.9649 1.9347 sec/batch
Epoch 5/20  Iteration 736/3560 Training loss: 1.9642 1.9002 sec/batch
Epoch 5/20  Iteration 737/3560 Training loss: 1.9639 2.0497 sec/batch
Epoch 5/20  Iteration 738/3560 Training loss: 1.9629 2.4274 sec/batch
Epoch 5/20  Iteration 739/3560 Training loss: 1.9622 2.0098 sec/batch
Epoch 5/20  Iteration 740/3560 Training loss: 1.9624 1.9324 sec/batch
Epoch 5/20  Iteration 741/3560 Training loss: 1.9634 1.9052 sec/batch
Epoch 5/20  Iteration 742/3560 Training loss: 1.9633 2.0790 sec/batch
Epoch 5/20  Iteration 743/3560 Training loss: 1.9632 2.0325 sec/batch
Epoch 5/20  Iteration 744/3560 Training loss: 1.9624 1.9890 sec/batch
Epoch 5/20  Iteration 745/3560 Training loss: 1.9622 1.9028 sec/batch
Epoch 5/20  Iteration 746/3560 Training loss: 1.9629 1.9047 sec/batch
Epoch 5/20  Iteration 747/3560 Training loss: 1.9625 1.9256 sec/batch
Epoch 5/20  Iteration 748/3560 Training loss: 1.9620 1.9336 sec/batch
Epoch 5/20  Iteration 749/3560 Training loss: 1.9614 1.9413 sec/batch
Epoch 5/20  Iteration 750/3560 Training loss: 1.9604 2.0393 sec/batch
Epoch 5/20  Iteration 751/3560 Training loss: 1.9591 2.0211 sec/batch
Epoch 5/20  Iteration 752/3560 Training loss: 1.9580 2.0735 sec/batch
Epoch 5/20  Iteration 753/3560 Training loss: 1.9570 1.8899 sec/batch
Epoch 5/20  Iteration 754/3560 Training loss: 1.9569 1.8846 sec/batch
Epoch 5/20  Iteration 755/3560 Training loss: 1.9559 1.9093 sec/batch
Epoch 5/20  Iteration 756/3560 Training loss: 1.9547 2.0559 sec/batch
Epoch 5/20  Iteration 757/3560 Training loss: 1.9543 2.1111 sec/batch
Epoch 5/20  Iteration 758/3560 Training loss: 1.9528 1.8496 sec/batch
Epoch 5/20  Iteration 759/3560 Training loss: 1.9527 2.2962 sec/batch
Epoch 5/20  Iteration 760/3560 Training loss: 1.9519 2.1380 sec/batch
Epoch 5/20  Iteration 761/3560 Training loss: 1.9516 2.2423 sec/batch
Epoch 5/20  Iteration 762/3560 Training loss: 1.9521 2.1509 sec/batch
Epoch 5/20  Iteration 763/3560 Training loss: 1.9512 2.2652 sec/batch
Epoch 5/20  Iteration 764/3560 Training loss: 1.9518 2.6516 sec/batch
Epoch 5/20  Iteration 765/3560 Training loss: 1.9514 2.1923 sec/batch
Epoch 5/20  Iteration 766/3560 Training loss: 1.9509 2.1513 sec/batch
Epoch 5/20  Iteration 767/3560 Training loss: 1.9502 2.2733 sec/batch
Epoch 5/20  Iteration 768/3560 Training loss: 1.9499 2.5490 sec/batch
Epoch 5/20  Iteration 769/3560 Training loss: 1.9498 2.1569 sec/batch
Epoch 5/20  Iteration 770/3560 Training loss: 1.9494 2.1919 sec/batch
Epoch 5/20  Iteration 771/3560 Training loss: 1.9487 2.0366 sec/batch
Epoch 5/20  Iteration 772/3560 Training loss: 1.9489 2.2829 sec/batch
Epoch 5/20  Iteration 773/3560 Training loss: 1.9485 2.7535 sec/batch
Epoch 5/20  Iteration 774/3560 Training loss: 1.9489 2.8784 sec/batch
Epoch 5/20  Iteration 775/3560 Training loss: 1.9490 2.2241 sec/batch
Epoch 5/20  Iteration 776/3560 Training loss: 1.9487 2.3184 sec/batch
Epoch 5/20  Iteration 777/3560 Training loss: 1.9483 2.2455 sec/batch
Epoch 5/20  Iteration 778/3560 Training loss: 1.9483 2.4694 sec/batch
Epoch 5/20  Iteration 779/3560 Training loss: 1.9481 1.9623 sec/batch
Epoch 5/20  Iteration 780/3560 Training loss: 1.9474 2.1813 sec/batch
Epoch 5/20  Iteration 781/3560 Training loss: 1.9469 2.4708 sec/batch
Epoch 5/20  Iteration 782/3560 Training loss: 1.9465 2.3821 sec/batch
Epoch 5/20  Iteration 783/3560 Training loss: 1.9467 2.0347 sec/batch
Epoch 5/20  Iteration 784/3560 Training loss: 1.9466 2.0903 sec/batch
Epoch 5/20  Iteration 785/3560 Training loss: 1.9466 1.9794 sec/batch
Epoch 5/20  Iteration 786/3560 Training loss: 1.9461 2.2617 sec/batch
Epoch 5/20  Iteration 787/3560 Training loss: 1.9459 2.3185 sec/batch
Epoch 5/20  Iteration 788/3560 Training loss: 1.9460 2.1215 sec/batch
Epoch 5/20  Iteration 789/3560 Training loss: 1.9456 2.0303 sec/batch
Epoch 5/20  Iteration 790/3560 Training loss: 1.9455 2.4386 sec/batch
Epoch 5/20  Iteration 791/3560 Training loss: 1.9449 2.2241 sec/batch
Epoch 5/20  Iteration 792/3560 Training loss: 1.9444 2.0794 sec/batch
Epoch 5/20  Iteration 793/3560 Training loss: 1.9437 2.1904 sec/batch
Epoch 5/20  Iteration 794/3560 Training loss: 1.9435 2.1700 sec/batch
Epoch 5/20  Iteration 795/3560 Training loss: 1.9428 2.5529 sec/batch
Epoch 5/20  Iteration 796/3560 Training loss: 1.9424 2.4878 sec/batch
Epoch 5/20  Iteration 797/3560 Training loss: 1.9416 2.0120 sec/batch
Epoch 5/20  Iteration 798/3560 Training loss: 1.9411 1.9249 sec/batch
Epoch 5/20  Iteration 799/3560 Training loss: 1.9407 1.9573 sec/batch
Epoch 5/20  Iteration 800/3560 Training loss: 1.9402 2.0705 sec/batch
Epoch 5/20  Iteration 801/3560 Training loss: 1.9395 2.0688 sec/batch
Epoch 5/20  Iteration 802/3560 Training loss: 1.9394 1.9284 sec/batch
Epoch 5/20  Iteration 803/3560 Training loss: 1.9388 2.1813 sec/batch
Epoch 5/20  Iteration 804/3560 Training loss: 1.9385 2.2808 sec/batch
Epoch 5/20  Iteration 805/3560 Training loss: 1.9379 2.1461 sec/batch
Epoch 5/20  Iteration 806/3560 Training loss: 1.9373 2.0436 sec/batch
Epoch 5/20  Iteration 807/3560 Training loss: 1.9369 2.1115 sec/batch
Epoch 5/20  Iteration 808/3560 Training loss: 1.9366 2.1731 sec/batch
Epoch 5/20  Iteration 809/3560 Training loss: 1.9363 2.0951 sec/batch
Epoch 5/20  Iteration 810/3560 Training loss: 1.9356 2.2815 sec/batch
Epoch 5/20  Iteration 811/3560 Training loss: 1.9350 2.3378 sec/batch
Epoch 5/20  Iteration 812/3560 Training loss: 1.9343 2.1962 sec/batch
Epoch 5/20  Iteration 813/3560 Training loss: 1.9341 2.2269 sec/batch
Epoch 5/20  Iteration 814/3560 Training loss: 1.9338 1.9333 sec/batch
Epoch 5/20  Iteration 815/3560 Training loss: 1.9334 1.9173 sec/batch
Epoch 5/20  Iteration 816/3560 Training loss: 1.9329 2.1020 sec/batch
Epoch 5/20  Iteration 817/3560 Training loss: 1.9325 2.1324 sec/batch
Epoch 5/20  Iteration 818/3560 Training loss: 1.9322 2.2378 sec/batch
Epoch 5/20  Iteration 819/3560 Training loss: 1.9319 2.1547 sec/batch
Epoch 5/20  Iteration 820/3560 Training loss: 1.9316 2.3346 sec/batch
Epoch 5/20  Iteration 821/3560 Training loss: 1.9315 2.7844 sec/batch
Epoch 5/20  Iteration 822/3560 Training loss: 1.9311 2.4889 sec/batch
Epoch 5/20  Iteration 823/3560 Training loss: 1.9309 2.0322 sec/batch
Epoch 5/20  Iteration 824/3560 Training loss: 1.9306 2.1335 sec/batch
Epoch 5/20  Iteration 825/3560 Training loss: 1.9302 2.2319 sec/batch
Epoch 5/20  Iteration 826/3560 Training loss: 1.9299 2.0347 sec/batch
Epoch 5/20  Iteration 827/3560 Training loss: 1.9294 1.8919 sec/batch
Epoch 5/20  Iteration 828/3560 Training loss: 1.9288 1.9902 sec/batch
Epoch 5/20  Iteration 829/3560 Training loss: 1.9286 2.0836 sec/batch
Epoch 5/20  Iteration 830/3560 Training loss: 1.9283 2.1302 sec/batch
Epoch 5/20  Iteration 831/3560 Training loss: 1.9281 2.0795 sec/batch
Epoch 5/20  Iteration 832/3560 Training loss: 1.9278 1.9988 sec/batch
Epoch 5/20  Iteration 833/3560 Training loss: 1.9276 1.8974 sec/batch
Epoch 5/20  Iteration 834/3560 Training loss: 1.9270 1.9360 sec/batch
Epoch 5/20  Iteration 835/3560 Training loss: 1.9267 1.9615 sec/batch
Epoch 5/20  Iteration 836/3560 Training loss: 1.9266 1.9130 sec/batch
Epoch 5/20  Iteration 837/3560 Training loss: 1.9263 1.8940 sec/batch
Epoch 5/20  Iteration 838/3560 Training loss: 1.9257 1.8902 sec/batch
Epoch 5/20  Iteration 839/3560 Training loss: 1.9255 1.9911 sec/batch
Epoch 5/20  Iteration 840/3560 Training loss: 1.9253 1.9703 sec/batch
Epoch 5/20  Iteration 841/3560 Training loss: 1.9250 1.8819 sec/batch
Epoch 5/20  Iteration 842/3560 Training loss: 1.9248 1.8401 sec/batch
Epoch 5/20  Iteration 843/3560 Training loss: 1.9244 2.0528 sec/batch
Epoch 5/20  Iteration 844/3560 Training loss: 1.9239 2.2121 sec/batch
Epoch 5/20  Iteration 845/3560 Training loss: 1.9238 1.9411 sec/batch
Epoch 5/20  Iteration 846/3560 Training loss: 1.9236 1.8756 sec/batch
Epoch 5/20  Iteration 847/3560 Training loss: 1.9234 1.8480 sec/batch
Epoch 5/20  Iteration 848/3560 Training loss: 1.9232 1.8971 sec/batch
Epoch 5/20  Iteration 849/3560 Training loss: 1.9231 2.0182 sec/batch
Epoch 5/20  Iteration 850/3560 Training loss: 1.9229 2.3755 sec/batch
Epoch 5/20  Iteration 851/3560 Training loss: 1.9229 2.5544 sec/batch
Epoch 5/20  Iteration 852/3560 Training loss: 1.9226 2.4949 sec/batch
Epoch 5/20  Iteration 853/3560 Training loss: 1.9225 2.4094 sec/batch
Epoch 5/20  Iteration 854/3560 Training loss: 1.9222 1.9588 sec/batch
Epoch 5/20  Iteration 855/3560 Training loss: 1.9219 2.0948 sec/batch
Epoch 5/20  Iteration 856/3560 Training loss: 1.9216 1.9180 sec/batch
Epoch 5/20  Iteration 857/3560 Training loss: 1.9213 1.9769 sec/batch
Epoch 5/20  Iteration 858/3560 Training loss: 1.9212 1.9612 sec/batch
Epoch 5/20  Iteration 859/3560 Training loss: 1.9210 2.1380 sec/batch
Epoch 5/20  Iteration 860/3560 Training loss: 1.9209 2.0421 sec/batch
Epoch 5/20  Iteration 861/3560 Training loss: 1.9207 2.2252 sec/batch
Epoch 5/20  Iteration 862/3560 Training loss: 1.9204 2.5323 sec/batch
Epoch 5/20  Iteration 863/3560 Training loss: 1.9200 2.0683 sec/batch
Epoch 5/20  Iteration 864/3560 Training loss: 1.9200 2.2681 sec/batch
Epoch 5/20  Iteration 865/3560 Training loss: 1.9198 2.3152 sec/batch
Epoch 5/20  Iteration 866/3560 Training loss: 1.9195 2.2022 sec/batch
Epoch 5/20  Iteration 867/3560 Training loss: 1.9193 1.9600 sec/batch
Epoch 5/20  Iteration 868/3560 Training loss: 1.9191 2.2565 sec/batch
Epoch 5/20  Iteration 869/3560 Training loss: 1.9189 2.0724 sec/batch
Epoch 5/20  Iteration 870/3560 Training loss: 1.9187 1.9326 sec/batch
Epoch 5/20  Iteration 871/3560 Training loss: 1.9183 1.9103 sec/batch
Epoch 5/20  Iteration 872/3560 Training loss: 1.9182 1.8444 sec/batch
Epoch 5/20  Iteration 873/3560 Training loss: 1.9182 1.8646 sec/batch
Epoch 5/20  Iteration 874/3560 Training loss: 1.9180 1.9094 sec/batch
Epoch 5/20  Iteration 875/3560 Training loss: 1.9179 2.2012 sec/batch
Epoch 5/20  Iteration 876/3560 Training loss: 1.9177 2.6937 sec/batch
Epoch 5/20  Iteration 877/3560 Training loss: 1.9176 2.2891 sec/batch
Epoch 5/20  Iteration 878/3560 Training loss: 1.9173 2.0210 sec/batch
Epoch 5/20  Iteration 879/3560 Training loss: 1.9172 2.0023 sec/batch
Epoch 5/20  Iteration 880/3560 Training loss: 1.9173 2.1005 sec/batch
Epoch 5/20  Iteration 881/3560 Training loss: 1.9171 2.3005 sec/batch
Epoch 5/20  Iteration 882/3560 Training loss: 1.9169 2.2093 sec/batch
Epoch 5/20  Iteration 883/3560 Training loss: 1.9166 2.2006 sec/batch
Epoch 5/20  Iteration 884/3560 Training loss: 1.9163 2.1059 sec/batch
Epoch 5/20  Iteration 885/3560 Training loss: 1.9162 2.0914 sec/batch
Epoch 5/20  Iteration 886/3560 Training loss: 1.9160 1.9214 sec/batch
Epoch 5/20  Iteration 887/3560 Training loss: 1.9159 2.2208 sec/batch
Epoch 5/20  Iteration 888/3560 Training loss: 1.9156 2.1861 sec/batch
Epoch 5/20  Iteration 889/3560 Training loss: 1.9153 2.5472 sec/batch
Epoch 5/20  Iteration 890/3560 Training loss: 1.9151 2.3502 sec/batch
Epoch 6/20  Iteration 891/3560 Training loss: 1.9761 1.9664 sec/batch
Epoch 6/20  Iteration 892/3560 Training loss: 1.9181 1.9457 sec/batch
Epoch 6/20  Iteration 893/3560 Training loss: 1.8988 2.1196 sec/batch
Epoch 6/20  Iteration 894/3560 Training loss: 1.8914 2.2050 sec/batch
Epoch 6/20  Iteration 895/3560 Training loss: 1.8870 1.8795 sec/batch
Epoch 6/20  Iteration 896/3560 Training loss: 1.8775 2.4028 sec/batch
Epoch 6/20  Iteration 897/3560 Training loss: 1.8762 2.1851 sec/batch
Epoch 6/20  Iteration 898/3560 Training loss: 1.8748 2.2058 sec/batch
Epoch 6/20  Iteration 899/3560 Training loss: 1.8770 1.9266 sec/batch
Epoch 6/20  Iteration 900/3560 Training loss: 1.8759 2.0986 sec/batch
Epoch 6/20  Iteration 901/3560 Training loss: 1.8728 2.0023 sec/batch
Epoch 6/20  Iteration 902/3560 Training loss: 1.8708 2.1743 sec/batch
Epoch 6/20  Iteration 903/3560 Training loss: 1.8698 2.0000 sec/batch
Epoch 6/20  Iteration 904/3560 Training loss: 1.8720 1.9109 sec/batch
Epoch 6/20  Iteration 905/3560 Training loss: 1.8707 1.8553 sec/batch
Epoch 6/20  Iteration 906/3560 Training loss: 1.8689 1.9208 sec/batch
Epoch 6/20  Iteration 907/3560 Training loss: 1.8684 2.0856 sec/batch
Epoch 6/20  Iteration 908/3560 Training loss: 1.8704 1.9139 sec/batch
Epoch 6/20  Iteration 909/3560 Training loss: 1.8701 1.8893 sec/batch
Epoch 6/20  Iteration 910/3560 Training loss: 1.8706 1.8457 sec/batch
Epoch 6/20  Iteration 911/3560 Training loss: 1.8698 1.9506 sec/batch
Epoch 6/20  Iteration 912/3560 Training loss: 1.8705 1.9542 sec/batch
Epoch 6/20  Iteration 913/3560 Training loss: 1.8698 1.8506 sec/batch
Epoch 6/20  Iteration 914/3560 Training loss: 1.8694 1.8642 sec/batch
Epoch 6/20  Iteration 915/3560 Training loss: 1.8692 1.8530 sec/batch
Epoch 6/20  Iteration 916/3560 Training loss: 1.8680 1.9670 sec/batch
Epoch 6/20  Iteration 917/3560 Training loss: 1.8669 2.2734 sec/batch
Epoch 6/20  Iteration 918/3560 Training loss: 1.8673 1.9069 sec/batch
Epoch 6/20  Iteration 919/3560 Training loss: 1.8681 1.8137 sec/batch
Epoch 6/20  Iteration 920/3560 Training loss: 1.8681 1.8346 sec/batch
Epoch 6/20  Iteration 921/3560 Training loss: 1.8682 1.8324 sec/batch
Epoch 6/20  Iteration 922/3560 Training loss: 1.8673 1.8457 sec/batch
Epoch 6/20  Iteration 923/3560 Training loss: 1.8675 1.8845 sec/batch
Epoch 6/20  Iteration 924/3560 Training loss: 1.8681 2.0207 sec/batch
Epoch 6/20  Iteration 925/3560 Training loss: 1.8673 1.8399 sec/batch
Epoch 6/20  Iteration 926/3560 Training loss: 1.8668 1.8441 sec/batch
Epoch 6/20  Iteration 927/3560 Training loss: 1.8663 1.8608 sec/batch
Epoch 6/20  Iteration 928/3560 Training loss: 1.8652 1.9847 sec/batch
Epoch 6/20  Iteration 929/3560 Training loss: 1.8639 1.8450 sec/batch
Epoch 6/20  Iteration 930/3560 Training loss: 1.8628 1.8751 sec/batch
Epoch 6/20  Iteration 931/3560 Training loss: 1.8620 1.9240 sec/batch
Epoch 6/20  Iteration 932/3560 Training loss: 1.8621 1.9370 sec/batch
Epoch 6/20  Iteration 933/3560 Training loss: 1.8614 1.8091 sec/batch
Epoch 6/20  Iteration 934/3560 Training loss: 1.8605 1.8457 sec/batch
Epoch 6/20  Iteration 935/3560 Training loss: 1.8604 2.0811 sec/batch
Epoch 6/20  Iteration 936/3560 Training loss: 1.8591 2.3449 sec/batch
Epoch 6/20  Iteration 937/3560 Training loss: 1.8586 2.4044 sec/batch
Epoch 6/20  Iteration 938/3560 Training loss: 1.8579 2.4650 sec/batch
Epoch 6/20  Iteration 939/3560 Training loss: 1.8574 2.0715 sec/batch
Epoch 6/20  Iteration 940/3560 Training loss: 1.8581 2.3898 sec/batch
Epoch 6/20  Iteration 941/3560 Training loss: 1.8574 1.9317 sec/batch
Epoch 6/20  Iteration 942/3560 Training loss: 1.8582 1.8798 sec/batch
Epoch 6/20  Iteration 943/3560 Training loss: 1.8579 1.8726 sec/batch
Epoch 6/20  Iteration 944/3560 Training loss: 1.8576 1.8225 sec/batch
Epoch 6/20  Iteration 945/3560 Training loss: 1.8570 1.8598 sec/batch
Epoch 6/20  Iteration 946/3560 Training loss: 1.8569 1.9764 sec/batch
Epoch 6/20  Iteration 947/3560 Training loss: 1.8570 1.9638 sec/batch
Epoch 6/20  Iteration 948/3560 Training loss: 1.8566 2.0932 sec/batch
Epoch 6/20  Iteration 949/3560 Training loss: 1.8561 2.4716 sec/batch
Epoch 6/20  Iteration 950/3560 Training loss: 1.8563 2.7549 sec/batch
Epoch 6/20  Iteration 951/3560 Training loss: 1.8560 2.6475 sec/batch
Epoch 6/20  Iteration 952/3560 Training loss: 1.8565 2.3175 sec/batch
Epoch 6/20  Iteration 953/3560 Training loss: 1.8568 2.1164 sec/batch
Epoch 6/20  Iteration 954/3560 Training loss: 1.8568 1.9955 sec/batch
Epoch 6/20  Iteration 955/3560 Training loss: 1.8565 2.1862 sec/batch
Epoch 6/20  Iteration 956/3560 Training loss: 1.8567 2.2067 sec/batch
Epoch 6/20  Iteration 957/3560 Training loss: 1.8568 1.9549 sec/batch
Epoch 6/20  Iteration 958/3560 Training loss: 1.8562 2.5056 sec/batch
Epoch 6/20  Iteration 959/3560 Training loss: 1.8559 2.2425 sec/batch
Epoch 6/20  Iteration 960/3560 Training loss: 1.8557 1.9442 sec/batch
Epoch 6/20  Iteration 961/3560 Training loss: 1.8559 2.0499 sec/batch
Epoch 6/20  Iteration 962/3560 Training loss: 1.8559 1.9959 sec/batch
Epoch 6/20  Iteration 963/3560 Training loss: 1.8561 1.8565 sec/batch
Epoch 6/20  Iteration 964/3560 Training loss: 1.8556 1.9161 sec/batch
Epoch 6/20  Iteration 965/3560 Training loss: 1.8555 2.1477 sec/batch
Epoch 6/20  Iteration 966/3560 Training loss: 1.8555 2.4231 sec/batch
Epoch 6/20  Iteration 967/3560 Training loss: 1.8553 2.3860 sec/batch
Epoch 6/20  Iteration 968/3560 Training loss: 1.8552 2.0077 sec/batch
Epoch 6/20  Iteration 969/3560 Training loss: 1.8545 2.0338 sec/batch
Epoch 6/20  Iteration 970/3560 Training loss: 1.8541 2.1701 sec/batch
Epoch 6/20  Iteration 971/3560 Training loss: 1.8535 2.0975 sec/batch
Epoch 6/20  Iteration 972/3560 Training loss: 1.8535 1.9633 sec/batch
Epoch 6/20  Iteration 973/3560 Training loss: 1.8529 1.8982 sec/batch
Epoch 6/20  Iteration 974/3560 Training loss: 1.8526 1.9764 sec/batch
Epoch 6/20  Iteration 975/3560 Training loss: 1.8520 2.1478 sec/batch
Epoch 6/20  Iteration 976/3560 Training loss: 1.8515 1.8919 sec/batch
Epoch 6/20  Iteration 977/3560 Training loss: 1.8512 1.8635 sec/batch
Epoch 6/20  Iteration 978/3560 Training loss: 1.8507 1.8997 sec/batch
Epoch 6/20  Iteration 979/3560 Training loss: 1.8502 1.8958 sec/batch
Epoch 6/20  Iteration 980/3560 Training loss: 1.8501 1.8446 sec/batch
Epoch 6/20  Iteration 981/3560 Training loss: 1.8495 1.8948 sec/batch
Epoch 6/20  Iteration 982/3560 Training loss: 1.8494 1.9019 sec/batch
Epoch 6/20  Iteration 983/3560 Training loss: 1.8488 1.8315 sec/batch
Epoch 6/20  Iteration 984/3560 Training loss: 1.8483 1.8686 sec/batch
Epoch 6/20  Iteration 985/3560 Training loss: 1.8480 1.8563 sec/batch
Epoch 6/20  Iteration 986/3560 Training loss: 1.8478 1.8332 sec/batch
Epoch 6/20  Iteration 987/3560 Training loss: 1.8476 1.8100 sec/batch
Epoch 6/20  Iteration 988/3560 Training loss: 1.8471 1.8597 sec/batch
Epoch 6/20  Iteration 989/3560 Training loss: 1.8466 1.8617 sec/batch
Epoch 6/20  Iteration 990/3560 Training loss: 1.8460 1.7944 sec/batch
Epoch 6/20  Iteration 991/3560 Training loss: 1.8458 1.9154 sec/batch
Epoch 6/20  Iteration 992/3560 Training loss: 1.8455 1.8516 sec/batch
Epoch 6/20  Iteration 993/3560 Training loss: 1.8452 1.7984 sec/batch
Epoch 6/20  Iteration 994/3560 Training loss: 1.8448 1.8081 sec/batch
Epoch 6/20  Iteration 995/3560 Training loss: 1.8444 1.8160 sec/batch
Epoch 6/20  Iteration 996/3560 Training loss: 1.8442 2.0147 sec/batch
Epoch 6/20  Iteration 997/3560 Training loss: 1.8439 1.8548 sec/batch
Epoch 6/20  Iteration 998/3560 Training loss: 1.8437 1.9268 sec/batch
Epoch 6/20  Iteration 999/3560 Training loss: 1.8437 1.8387 sec/batch
Epoch 6/20  Iteration 1000/3560 Training loss: 1.8434 1.7470 sec/batch
Epoch 6/20  Iteration 1001/3560 Training loss: 1.8432 1.9359 sec/batch
Epoch 6/20  Iteration 1002/3560 Training loss: 1.8429 1.9253 sec/batch
Epoch 6/20  Iteration 1003/3560 Training loss: 1.8427 1.7708 sec/batch
Epoch 6/20  Iteration 1004/3560 Training loss: 1.8424 1.7431 sec/batch
Epoch 6/20  Iteration 1005/3560 Training loss: 1.8419 1.7246 sec/batch
Epoch 6/20  Iteration 1006/3560 Training loss: 1.8414 1.7603 sec/batch
Epoch 6/20  Iteration 1007/3560 Training loss: 1.8413 1.7332 sec/batch
Epoch 6/20  Iteration 1008/3560 Training loss: 1.8411 1.7750 sec/batch
Epoch 6/20  Iteration 1009/3560 Training loss: 1.8410 1.7144 sec/batch
Epoch 6/20  Iteration 1010/3560 Training loss: 1.8408 1.7549 sec/batch
Epoch 6/20  Iteration 1011/3560 Training loss: 1.8406 1.7328 sec/batch
Epoch 6/20  Iteration 1012/3560 Training loss: 1.8402 1.7032 sec/batch
Epoch 6/20  Iteration 1013/3560 Training loss: 1.8400 1.8720 sec/batch
Epoch 6/20  Iteration 1014/3560 Training loss: 1.8400 1.7986 sec/batch
Epoch 6/20  Iteration 1015/3560 Training loss: 1.8397 1.7339 sec/batch
Epoch 6/20  Iteration 1016/3560 Training loss: 1.8392 1.7450 sec/batch
Epoch 6/20  Iteration 1017/3560 Training loss: 1.8392 1.7743 sec/batch
Epoch 6/20  Iteration 1018/3560 Training loss: 1.8391 1.9491 sec/batch
Epoch 6/20  Iteration 1019/3560 Training loss: 1.8389 1.8869 sec/batch
Epoch 6/20  Iteration 1020/3560 Training loss: 1.8386 1.7384 sec/batch
Epoch 6/20  Iteration 1021/3560 Training loss: 1.8381 1.9219 sec/batch
Epoch 6/20  Iteration 1022/3560 Training loss: 1.8378 1.8036 sec/batch
Epoch 6/20  Iteration 1023/3560 Training loss: 1.8377 1.7953 sec/batch
Epoch 6/20  Iteration 1024/3560 Training loss: 1.8375 1.7684 sec/batch
Epoch 6/20  Iteration 1025/3560 Training loss: 1.8373 1.7589 sec/batch
Epoch 6/20  Iteration 1026/3560 Training loss: 1.8373 1.7558 sec/batch
Epoch 6/20  Iteration 1027/3560 Training loss: 1.8372 1.7219 sec/batch
Epoch 6/20  Iteration 1028/3560 Training loss: 1.8372 1.7821 sec/batch
Epoch 6/20  Iteration 1029/3560 Training loss: 1.8372 1.8532 sec/batch
Epoch 6/20  Iteration 1030/3560 Training loss: 1.8369 1.8269 sec/batch
Epoch 6/20  Iteration 1031/3560 Training loss: 1.8370 1.7398 sec/batch
Epoch 6/20  Iteration 1032/3560 Training loss: 1.8367 1.7171 sec/batch
Epoch 6/20  Iteration 1033/3560 Training loss: 1.8365 1.7466 sec/batch
Epoch 6/20  Iteration 1034/3560 Training loss: 1.8364 1.7586 sec/batch
Epoch 6/20  Iteration 1035/3560 Training loss: 1.8361 1.8588 sec/batch
Epoch 6/20  Iteration 1036/3560 Training loss: 1.8361 1.8184 sec/batch
Epoch 6/20  Iteration 1037/3560 Training loss: 1.8360 1.7538 sec/batch
Epoch 6/20  Iteration 1038/3560 Training loss: 1.8360 1.7776 sec/batch
Epoch 6/20  Iteration 1039/3560 Training loss: 1.8359 1.7549 sec/batch
Epoch 6/20  Iteration 1040/3560 Training loss: 1.8356 1.9636 sec/batch
Epoch 6/20  Iteration 1041/3560 Training loss: 1.8353 1.9113 sec/batch
Epoch 6/20  Iteration 1042/3560 Training loss: 1.8353 1.7497 sec/batch
Epoch 6/20  Iteration 1043/3560 Training loss: 1.8352 1.7218 sec/batch
Epoch 6/20  Iteration 1044/3560 Training loss: 1.8350 1.7361 sec/batch
Epoch 6/20  Iteration 1045/3560 Training loss: 1.8348 1.7451 sec/batch
Epoch 6/20  Iteration 1046/3560 Training loss: 1.8346 1.7117 sec/batch
Epoch 6/20  Iteration 1047/3560 Training loss: 1.8345 1.7962 sec/batch
Epoch 6/20  Iteration 1048/3560 Training loss: 1.8343 1.7470 sec/batch
Epoch 6/20  Iteration 1049/3560 Training loss: 1.8339 1.7712 sec/batch
Epoch 6/20  Iteration 1050/3560 Training loss: 1.8339 1.7430 sec/batch
Epoch 6/20  Iteration 1051/3560 Training loss: 1.8340 1.7428 sec/batch
Epoch 6/20  Iteration 1052/3560 Training loss: 1.8338 1.7628 sec/batch
Epoch 6/20  Iteration 1053/3560 Training loss: 1.8337 1.7729 sec/batch
Epoch 6/20  Iteration 1054/3560 Training loss: 1.8335 1.7223 sec/batch
Epoch 6/20  Iteration 1055/3560 Training loss: 1.8334 1.7454 sec/batch
Epoch 6/20  Iteration 1056/3560 Training loss: 1.8332 1.7504 sec/batch
Epoch 6/20  Iteration 1057/3560 Training loss: 1.8330 1.7782 sec/batch
Epoch 6/20  Iteration 1058/3560 Training loss: 1.8332 1.8216 sec/batch
Epoch 6/20  Iteration 1059/3560 Training loss: 1.8330 1.7595 sec/batch
Epoch 6/20  Iteration 1060/3560 Training loss: 1.8328 1.9320 sec/batch
Epoch 6/20  Iteration 1061/3560 Training loss: 1.8325 1.9329 sec/batch
Epoch 6/20  Iteration 1062/3560 Training loss: 1.8323 1.8793 sec/batch
Epoch 6/20  Iteration 1063/3560 Training loss: 1.8322 2.0155 sec/batch
Epoch 6/20  Iteration 1064/3560 Training loss: 1.8321 1.8717 sec/batch
Epoch 6/20  Iteration 1065/3560 Training loss: 1.8320 1.8216 sec/batch
Epoch 6/20  Iteration 1066/3560 Training loss: 1.8319 1.8921 sec/batch
Epoch 6/20  Iteration 1067/3560 Training loss: 1.8316 1.8333 sec/batch
Epoch 6/20  Iteration 1068/3560 Training loss: 1.8315 1.9140 sec/batch
Epoch 7/20  Iteration 1069/3560 Training loss: 1.8871 1.8073 sec/batch
Epoch 7/20  Iteration 1070/3560 Training loss: 1.8403 1.9898 sec/batch
Epoch 7/20  Iteration 1071/3560 Training loss: 1.8225 1.7383 sec/batch
Epoch 7/20  Iteration 1072/3560 Training loss: 1.8148 1.7260 sec/batch
Epoch 7/20  Iteration 1073/3560 Training loss: 1.8106 2.0806 sec/batch
Epoch 7/20  Iteration 1074/3560 Training loss: 1.8009 1.8632 sec/batch
Epoch 7/20  Iteration 1075/3560 Training loss: 1.8015 1.8448 sec/batch
Epoch 7/20  Iteration 1076/3560 Training loss: 1.8001 1.8378 sec/batch
Epoch 7/20  Iteration 1077/3560 Training loss: 1.8034 1.8364 sec/batch
Epoch 7/20  Iteration 1078/3560 Training loss: 1.8027 1.8050 sec/batch
Epoch 7/20  Iteration 1079/3560 Training loss: 1.7992 2.0929 sec/batch
Epoch 7/20  Iteration 1080/3560 Training loss: 1.7966 2.1304 sec/batch
Epoch 7/20  Iteration 1081/3560 Training loss: 1.7959 1.9064 sec/batch
Epoch 7/20  Iteration 1082/3560 Training loss: 1.7984 1.9938 sec/batch
Epoch 7/20  Iteration 1083/3560 Training loss: 1.7974 1.8555 sec/batch
Epoch 7/20  Iteration 1084/3560 Training loss: 1.7959 2.0433 sec/batch
Epoch 7/20  Iteration 1085/3560 Training loss: 1.7962 1.9896 sec/batch
Epoch 7/20  Iteration 1086/3560 Training loss: 1.7980 2.0824 sec/batch
Epoch 7/20  Iteration 1087/3560 Training loss: 1.7976 1.8272 sec/batch
Epoch 7/20  Iteration 1088/3560 Training loss: 1.7980 2.0757 sec/batch
Epoch 7/20  Iteration 1089/3560 Training loss: 1.7964 2.0414 sec/batch
Epoch 7/20  Iteration 1090/3560 Training loss: 1.7971 2.0511 sec/batch
Epoch 7/20  Iteration 1091/3560 Training loss: 1.7961 1.8823 sec/batch
Epoch 7/20  Iteration 1092/3560 Training loss: 1.7956 1.9892 sec/batch
Epoch 7/20  Iteration 1093/3560 Training loss: 1.7956 1.8863 sec/batch
Epoch 7/20  Iteration 1094/3560 Training loss: 1.7944 1.7946 sec/batch
Epoch 7/20  Iteration 1095/3560 Training loss: 1.7931 1.9604 sec/batch
Epoch 7/20  Iteration 1096/3560 Training loss: 1.7934 1.9887 sec/batch
Epoch 7/20  Iteration 1097/3560 Training loss: 1.7940 2.0876 sec/batch
Epoch 7/20  Iteration 1098/3560 Training loss: 1.7944 1.9730 sec/batch
Epoch 7/20  Iteration 1099/3560 Training loss: 1.7941 1.9756 sec/batch
Epoch 7/20  Iteration 1100/3560 Training loss: 1.7931 2.3645 sec/batch
Epoch 7/20  Iteration 1101/3560 Training loss: 1.7931 1.9634 sec/batch
Epoch 7/20  Iteration 1102/3560 Training loss: 1.7938 1.8213 sec/batch
Epoch 7/20  Iteration 1103/3560 Training loss: 1.7933 1.8679 sec/batch
Epoch 7/20  Iteration 1104/3560 Training loss: 1.7929 2.0744 sec/batch
Epoch 7/20  Iteration 1105/3560 Training loss: 1.7923 2.3047 sec/batch
Epoch 7/20  Iteration 1106/3560 Training loss: 1.7910 1.9237 sec/batch
Epoch 7/20  Iteration 1107/3560 Training loss: 1.7896 1.9079 sec/batch
Epoch 7/20  Iteration 1108/3560 Training loss: 1.7886 1.7531 sec/batch
Epoch 7/20  Iteration 1109/3560 Training loss: 1.7880 1.8216 sec/batch
Epoch 7/20  Iteration 1110/3560 Training loss: 1.7880 1.7734 sec/batch
Epoch 7/20  Iteration 1111/3560 Training loss: 1.7873 1.7843 sec/batch
Epoch 7/20  Iteration 1112/3560 Training loss: 1.7865 1.7437 sec/batch
Epoch 7/20  Iteration 1113/3560 Training loss: 1.7864 1.7205 sec/batch
Epoch 7/20  Iteration 1114/3560 Training loss: 1.7853 1.7633 sec/batch
Epoch 7/20  Iteration 1115/3560 Training loss: 1.7851 1.7641 sec/batch
Epoch 7/20  Iteration 1116/3560 Training loss: 1.7844 1.7388 sec/batch
Epoch 7/20  Iteration 1117/3560 Training loss: 1.7841 1.7274 sec/batch
Epoch 7/20  Iteration 1118/3560 Training loss: 1.7847 1.7214 sec/batch
Epoch 7/20  Iteration 1119/3560 Training loss: 1.7840 1.7356 sec/batch
Epoch 7/20  Iteration 1120/3560 Training loss: 1.7848 1.8131 sec/batch
Epoch 7/20  Iteration 1121/3560 Training loss: 1.7847 1.7947 sec/batch
Epoch 7/20  Iteration 1122/3560 Training loss: 1.7843 1.7167 sec/batch
Epoch 7/20  Iteration 1123/3560 Training loss: 1.7839 1.7568 sec/batch
Epoch 7/20  Iteration 1124/3560 Training loss: 1.7838 1.7632 sec/batch
Epoch 7/20  Iteration 1125/3560 Training loss: 1.7840 1.8508 sec/batch
Epoch 7/20  Iteration 1126/3560 Training loss: 1.7838 1.9478 sec/batch
Epoch 7/20  Iteration 1127/3560 Training loss: 1.7833 1.8366 sec/batch
Epoch 7/20  Iteration 1128/3560 Training loss: 1.7838 1.7383 sec/batch
Epoch 7/20  Iteration 1129/3560 Training loss: 1.7835 1.7473 sec/batch
Epoch 7/20  Iteration 1130/3560 Training loss: 1.7842 1.7605 sec/batch
Epoch 7/20  Iteration 1131/3560 Training loss: 1.7845 1.7588 sec/batch
Epoch 7/20  Iteration 1132/3560 Training loss: 1.7846 1.7200 sec/batch
Epoch 7/20  Iteration 1133/3560 Training loss: 1.7844 1.7586 sec/batch
Epoch 7/20  Iteration 1134/3560 Training loss: 1.7845 1.7445 sec/batch
Epoch 7/20  Iteration 1135/3560 Training loss: 1.7846 1.7123 sec/batch
Epoch 7/20  Iteration 1136/3560 Training loss: 1.7842 1.7391 sec/batch
Epoch 7/20  Iteration 1137/3560 Training loss: 1.7839 1.7252 sec/batch
Epoch 7/20  Iteration 1138/3560 Training loss: 1.7838 1.7252 sec/batch
Epoch 7/20  Iteration 1139/3560 Training loss: 1.7841 1.7561 sec/batch
Epoch 7/20  Iteration 1140/3560 Training loss: 1.7841 1.7387 sec/batch
Epoch 7/20  Iteration 1141/3560 Training loss: 1.7843 1.7161 sec/batch
Epoch 7/20  Iteration 1142/3560 Training loss: 1.7838 1.7643 sec/batch
Epoch 7/20  Iteration 1143/3560 Training loss: 1.7837 1.8624 sec/batch
Epoch 7/20  Iteration 1144/3560 Training loss: 1.7839 1.8216 sec/batch
Epoch 7/20  Iteration 1145/3560 Training loss: 1.7836 1.7511 sec/batch
Epoch 7/20  Iteration 1146/3560 Training loss: 1.7834 1.7618 sec/batch
Epoch 7/20  Iteration 1147/3560 Training loss: 1.7828 1.7448 sec/batch
Epoch 7/20  Iteration 1148/3560 Training loss: 1.7824 1.9571 sec/batch
Epoch 7/20  Iteration 1149/3560 Training loss: 1.7819 1.8993 sec/batch
Epoch 7/20  Iteration 1150/3560 Training loss: 1.7821 1.7348 sec/batch
Epoch 7/20  Iteration 1151/3560 Training loss: 1.7814 1.7657 sec/batch
Epoch 7/20  Iteration 1152/3560 Training loss: 1.7812 1.9966 sec/batch
Epoch 7/20  Iteration 1153/3560 Training loss: 1.7806 1.8371 sec/batch
Epoch 7/20  Iteration 1154/3560 Training loss: 1.7802 1.8067 sec/batch
Epoch 7/20  Iteration 1155/3560 Training loss: 1.7801 1.8931 sec/batch
Epoch 7/20  Iteration 1156/3560 Training loss: 1.7797 1.8471 sec/batch
Epoch 7/20  Iteration 1157/3560 Training loss: 1.7791 2.0313 sec/batch
Epoch 7/20  Iteration 1158/3560 Training loss: 1.7791 1.9940 sec/batch
Epoch 7/20  Iteration 1159/3560 Training loss: 1.7786 1.9911 sec/batch
Epoch 7/20  Iteration 1160/3560 Training loss: 1.7783 2.0036 sec/batch
Epoch 7/20  Iteration 1161/3560 Training loss: 1.7778 2.0029 sec/batch
Epoch 7/20  Iteration 1162/3560 Training loss: 1.7774 2.0949 sec/batch
Epoch 7/20  Iteration 1163/3560 Training loss: 1.7770 1.9314 sec/batch
Epoch 7/20  Iteration 1164/3560 Training loss: 1.7768 1.9936 sec/batch
Epoch 7/20  Iteration 1165/3560 Training loss: 1.7765 1.8249 sec/batch
Epoch 7/20  Iteration 1166/3560 Training loss: 1.7760 1.7592 sec/batch
Epoch 7/20  Iteration 1167/3560 Training loss: 1.7757 1.7770 sec/batch
Epoch 7/20  Iteration 1168/3560 Training loss: 1.7751 1.7550 sec/batch
Epoch 7/20  Iteration 1169/3560 Training loss: 1.7749 1.9479 sec/batch
Epoch 7/20  Iteration 1170/3560 Training loss: 1.7747 1.9245 sec/batch
Epoch 7/20  Iteration 1171/3560 Training loss: 1.7743 1.7517 sec/batch
Epoch 7/20  Iteration 1172/3560 Training loss: 1.7740 1.7155 sec/batch
Epoch 7/20  Iteration 1173/3560 Training loss: 1.7737 1.7528 sec/batch
Epoch 7/20  Iteration 1174/3560 Training loss: 1.7736 1.7191 sec/batch
Epoch 7/20  Iteration 1175/3560 Training loss: 1.7734 1.7345 sec/batch
Epoch 7/20  Iteration 1176/3560 Training loss: 1.7733 1.9841 sec/batch
Epoch 7/20  Iteration 1177/3560 Training loss: 1.7732 1.9214 sec/batch
Epoch 7/20  Iteration 1178/3560 Training loss: 1.7731 1.7407 sec/batch
Epoch 7/20  Iteration 1179/3560 Training loss: 1.7729 1.7459 sec/batch
Epoch 7/20  Iteration 1180/3560 Training loss: 1.7726 1.8961 sec/batch
Epoch 7/20  Iteration 1181/3560 Training loss: 1.7724 1.9126 sec/batch
Epoch 7/20  Iteration 1182/3560 Training loss: 1.7722 1.7437 sec/batch
Epoch 7/20  Iteration 1183/3560 Training loss: 1.7719 1.7504 sec/batch
Epoch 7/20  Iteration 1184/3560 Training loss: 1.7715 1.7504 sec/batch
Epoch 7/20  Iteration 1185/3560 Training loss: 1.7714 1.7326 sec/batch
Epoch 7/20  Iteration 1186/3560 Training loss: 1.7712 1.8517 sec/batch
Epoch 7/20  Iteration 1187/3560 Training loss: 1.7711 1.8136 sec/batch
Epoch 7/20  Iteration 1188/3560 Training loss: 1.7710 1.7277 sec/batch
Epoch 7/20  Iteration 1189/3560 Training loss: 1.7709 1.7670 sec/batch
Epoch 7/20  Iteration 1190/3560 Training loss: 1.7705 1.7593 sec/batch
Epoch 7/20  Iteration 1191/3560 Training loss: 1.7700 1.8850 sec/batch
Epoch 7/20  Iteration 1192/3560 Training loss: 1.7701 1.9464 sec/batch
Epoch 7/20  Iteration 1193/3560 Training loss: 1.7700 1.7206 sec/batch
Epoch 7/20  Iteration 1194/3560 Training loss: 1.7696 1.7694 sec/batch
Epoch 7/20  Iteration 1195/3560 Training loss: 1.7696 1.7665 sec/batch
Epoch 7/20  Iteration 1196/3560 Training loss: 1.7696 1.7557 sec/batch
Epoch 7/20  Iteration 1197/3560 Training loss: 1.7694 1.7624 sec/batch
Epoch 7/20  Iteration 1198/3560 Training loss: 1.7692 1.7274 sec/batch
Epoch 7/20  Iteration 1199/3560 Training loss: 1.7688 1.7723 sec/batch
Epoch 7/20  Iteration 1200/3560 Training loss: 1.7685 1.7583 sec/batch
Epoch 7/20  Iteration 1201/3560 Training loss: 1.7684 1.7086 sec/batch
Epoch 7/20  Iteration 1202/3560 Training loss: 1.7684 1.7588 sec/batch
Epoch 7/20  Iteration 1203/3560 Training loss: 1.7683 1.7610 sec/batch
Epoch 7/20  Iteration 1204/3560 Training loss: 1.7682 1.8067 sec/batch
Epoch 7/20  Iteration 1205/3560 Training loss: 1.7683 1.9682 sec/batch
Epoch 7/20  Iteration 1206/3560 Training loss: 1.7682 2.0100 sec/batch
Epoch 7/20  Iteration 1207/3560 Training loss: 1.7682 1.9362 sec/batch
Epoch 7/20  Iteration 1208/3560 Training loss: 1.7681 1.9232 sec/batch
Epoch 7/20  Iteration 1209/3560 Training loss: 1.7682 2.0145 sec/batch
Epoch 7/20  Iteration 1210/3560 Training loss: 1.7679 1.9229 sec/batch
Epoch 7/20  Iteration 1211/3560 Training loss: 1.7678 1.8634 sec/batch
Epoch 7/20  Iteration 1212/3560 Training loss: 1.7677 1.8766 sec/batch
Epoch 7/20  Iteration 1213/3560 Training loss: 1.7675 1.9964 sec/batch
Epoch 7/20  Iteration 1214/3560 Training loss: 1.7675 2.5376 sec/batch
Epoch 7/20  Iteration 1215/3560 Training loss: 1.7674 2.0138 sec/batch
Epoch 7/20  Iteration 1216/3560 Training loss: 1.7675 1.7898 sec/batch
Epoch 7/20  Iteration 1217/3560 Training loss: 1.7674 1.9228 sec/batch
Epoch 7/20  Iteration 1218/3560 Training loss: 1.7671 1.7961 sec/batch
Epoch 7/20  Iteration 1219/3560 Training loss: 1.7668 2.1404 sec/batch
Epoch 7/20  Iteration 1220/3560 Training loss: 1.7668 2.3059 sec/batch
Epoch 7/20  Iteration 1221/3560 Training loss: 1.7667 1.9829 sec/batch
Epoch 7/20  Iteration 1222/3560 Training loss: 1.7666 2.0469 sec/batch
Epoch 7/20  Iteration 1223/3560 Training loss: 1.7665 1.9629 sec/batch
Epoch 7/20  Iteration 1224/3560 Training loss: 1.7663 1.9166 sec/batch
Epoch 7/20  Iteration 1225/3560 Training loss: 1.7663 1.7829 sec/batch
Epoch 7/20  Iteration 1226/3560 Training loss: 1.7663 2.0809 sec/batch
Epoch 7/20  Iteration 1227/3560 Training loss: 1.7659 1.8571 sec/batch
Epoch 7/20  Iteration 1228/3560 Training loss: 1.7659 1.7790 sec/batch
Epoch 7/20  Iteration 1229/3560 Training loss: 1.7660 1.9304 sec/batch
Epoch 7/20  Iteration 1230/3560 Training loss: 1.7659 1.9237 sec/batch
Epoch 7/20  Iteration 1231/3560 Training loss: 1.7658 2.0220 sec/batch
Epoch 7/20  Iteration 1232/3560 Training loss: 1.7657 1.9073 sec/batch
Epoch 7/20  Iteration 1233/3560 Training loss: 1.7656 2.1592 sec/batch
Epoch 7/20  Iteration 1234/3560 Training loss: 1.7655 2.2751 sec/batch
Epoch 7/20  Iteration 1235/3560 Training loss: 1.7655 1.8881 sec/batch
Epoch 7/20  Iteration 1236/3560 Training loss: 1.7657 1.7392 sec/batch
Epoch 7/20  Iteration 1237/3560 Training loss: 1.7655 1.8664 sec/batch
Epoch 7/20  Iteration 1238/3560 Training loss: 1.7654 3.0749 sec/batch
Epoch 7/20  Iteration 1239/3560 Training loss: 1.7652 2.5360 sec/batch
Epoch 7/20  Iteration 1240/3560 Training loss: 1.7649 2.1281 sec/batch
Epoch 7/20  Iteration 1241/3560 Training loss: 1.7649 1.8712 sec/batch
Epoch 7/20  Iteration 1242/3560 Training loss: 1.7649 1.9082 sec/batch
Epoch 7/20  Iteration 1243/3560 Training loss: 1.7649 1.9864 sec/batch
Epoch 7/20  Iteration 1244/3560 Training loss: 1.7647 2.3108 sec/batch
Epoch 7/20  Iteration 1245/3560 Training loss: 1.7644 2.3795 sec/batch
Epoch 7/20  Iteration 1246/3560 Training loss: 1.7644 2.6662 sec/batch
Epoch 8/20  Iteration 1247/3560 Training loss: 1.8373 2.8743 sec/batch
Epoch 8/20  Iteration 1248/3560 Training loss: 1.7865 2.9320 sec/batch
Epoch 8/20  Iteration 1249/3560 Training loss: 1.7695 2.9278 sec/batch
Epoch 8/20  Iteration 1250/3560 Training loss: 1.7606 2.6834 sec/batch
Epoch 8/20  Iteration 1251/3560 Training loss: 1.7565 2.9266 sec/batch
Epoch 8/20  Iteration 1252/3560 Training loss: 1.7453 3.3921 sec/batch
Epoch 8/20  Iteration 1253/3560 Training loss: 1.7441 3.0579 sec/batch
Epoch 8/20  Iteration 1254/3560 Training loss: 1.7432 2.5925 sec/batch
Epoch 8/20  Iteration 1255/3560 Training loss: 1.7450 2.5805 sec/batch
Epoch 8/20  Iteration 1256/3560 Training loss: 1.7437 2.5567 sec/batch
Epoch 8/20  Iteration 1257/3560 Training loss: 1.7401 2.3245 sec/batch
Epoch 8/20  Iteration 1258/3560 Training loss: 1.7393 1.9191 sec/batch
Epoch 8/20  Iteration 1259/3560 Training loss: 1.7386 1.9124 sec/batch
Epoch 8/20  Iteration 1260/3560 Training loss: 1.7412 1.8995 sec/batch
Epoch 8/20  Iteration 1261/3560 Training loss: 1.7410 1.8932 sec/batch
Epoch 8/20  Iteration 1262/3560 Training loss: 1.7391 1.8690 sec/batch
Epoch 8/20  Iteration 1263/3560 Training loss: 1.7391 1.8598 sec/batch
...
Epoch 8/20  Iteration 1424/3560 Training loss: 1.7115 1.8230 sec/batch
Epoch 9/20  Iteration 1425/3560 Training loss: 1.7948 1.8523 sec/batch
...
Epoch 9/20  Iteration 1602/3560 Training loss: 1.6682 2.1210 sec/batch
Epoch 10/20  Iteration 1603/3560 Training loss: 1.7466 2.0121 sec/batch
...
Epoch 10/20  Iteration 1780/3560 Training loss: 1.6316 1.9587 sec/batch
Epoch 11/20  Iteration 1781/3560 Training loss: 1.7065 1.9282 sec/batch
...
Epoch 11/20  Iteration 1958/3560 Training loss: 1.5996 1.8933 sec/batch
Epoch 12/20  Iteration 1959/3560 Training loss: 1.6858 2.0780 sec/batch
...
Epoch 12/20  Iteration 2136/3560 Training loss: 1.5734 1.8992 sec/batch
Epoch 13/20  Iteration 2137/3560 Training loss: 1.6672 1.8728 sec/batch
Epoch 13/20  Iteration 2138/3560 Training loss: 1.6198 2.0309 sec/batch
Epoch 13/20  Iteration 2139/3560 Training loss: 1.5988 1.8897 sec/batch
Epoch 13/20  Iteration 2140/3560 Training loss: 1.5937 1.9145 sec/batch
Epoch 13/20  Iteration 2141/3560 Training loss: 1.5865 1.9415 sec/batch
Epoch 13/20  Iteration 2142/3560 Training loss: 1.5763 1.8867 sec/batch
Epoch 13/20  Iteration 2143/3560 Training loss: 1.5764 1.8846 sec/batch
Epoch 13/20  Iteration 2144/3560 Training loss: 1.5738 1.8867 sec/batch
Epoch 13/20  Iteration 2145/3560 Training loss: 1.5743 1.8680 sec/batch
Epoch 13/20  Iteration 2146/3560 Training loss: 1.5732 1.8892 sec/batch
Epoch 13/20  Iteration 2147/3560 Training loss: 1.5698 1.8927 sec/batch
Epoch 13/20  Iteration 2148/3560 Training loss: 1.5689 2.0087 sec/batch
Epoch 13/20  Iteration 2149/3560 Training loss: 1.5684 1.9796 sec/batch
Epoch 13/20  Iteration 2150/3560 Training loss: 1.5703 2.1369 sec/batch
Epoch 13/20  Iteration 2151/3560 Training loss: 1.5689 2.2404 sec/batch
Epoch 13/20  Iteration 2152/3560 Training loss: 1.5663 2.0772 sec/batch
Epoch 13/20  Iteration 2153/3560 Training loss: 1.5664 2.1279 sec/batch
Epoch 13/20  Iteration 2154/3560 Training loss: 1.5679 2.0818 sec/batch
Epoch 13/20  Iteration 2155/3560 Training loss: 1.5681 1.8866 sec/batch
Epoch 13/20  Iteration 2156/3560 Training loss: 1.5688 1.8836 sec/batch
Epoch 13/20  Iteration 2157/3560 Training loss: 1.5682 1.9000 sec/batch
Epoch 13/20  Iteration 2158/3560 Training loss: 1.5688 1.9249 sec/batch
Epoch 13/20  Iteration 2159/3560 Training loss: 1.5682 1.9591 sec/batch
Epoch 13/20  Iteration 2160/3560 Training loss: 1.5676 1.9336 sec/batch
Epoch 13/20  Iteration 2161/3560 Training loss: 1.5677 1.8896 sec/batch
Epoch 13/20  Iteration 2162/3560 Training loss: 1.5664 1.9540 sec/batch
Epoch 13/20  Iteration 2163/3560 Training loss: 1.5649 1.9937 sec/batch
Epoch 13/20  Iteration 2164/3560 Training loss: 1.5653 1.8979 sec/batch
Epoch 13/20  Iteration 2165/3560 Training loss: 1.5657 1.8684 sec/batch
Epoch 13/20  Iteration 2166/3560 Training loss: 1.5661 1.9329 sec/batch
Epoch 13/20  Iteration 2167/3560 Training loss: 1.5660 1.9284 sec/batch
Epoch 13/20  Iteration 2168/3560 Training loss: 1.5648 1.8622 sec/batch
Epoch 13/20  Iteration 2169/3560 Training loss: 1.5651 1.9530 sec/batch
Epoch 13/20  Iteration 2170/3560 Training loss: 1.5654 2.0552 sec/batch
Epoch 13/20  Iteration 2171/3560 Training loss: 1.5651 2.0542 sec/batch
Epoch 13/20  Iteration 2172/3560 Training loss: 1.5647 1.9954 sec/batch
Epoch 13/20  Iteration 2173/3560 Training loss: 1.5638 1.8858 sec/batch
Epoch 13/20  Iteration 2174/3560 Training loss: 1.5626 2.0666 sec/batch
Epoch 13/20  Iteration 2175/3560 Training loss: 1.5611 2.1099 sec/batch
Epoch 13/20  Iteration 2176/3560 Training loss: 1.5606 1.8945 sec/batch
Epoch 13/20  Iteration 2177/3560 Training loss: 1.5600 1.9047 sec/batch
Epoch 13/20  Iteration 2178/3560 Training loss: 1.5606 1.9486 sec/batch
Epoch 13/20  Iteration 2179/3560 Training loss: 1.5601 1.9196 sec/batch
Epoch 13/20  Iteration 2180/3560 Training loss: 1.5595 1.8798 sec/batch
Epoch 13/20  Iteration 2181/3560 Training loss: 1.5598 1.9217 sec/batch
Epoch 13/20  Iteration 2182/3560 Training loss: 1.5589 1.9357 sec/batch
Epoch 13/20  Iteration 2183/3560 Training loss: 1.5586 1.8923 sec/batch
Epoch 13/20  Iteration 2184/3560 Training loss: 1.5582 1.8956 sec/batch
Epoch 13/20  Iteration 2185/3560 Training loss: 1.5579 1.9238 sec/batch
Epoch 13/20  Iteration 2186/3560 Training loss: 1.5587 2.0364 sec/batch
Epoch 13/20  Iteration 2187/3560 Training loss: 1.5581 1.8939 sec/batch
Epoch 13/20  Iteration 2188/3560 Training loss: 1.5589 1.8439 sec/batch
Epoch 13/20  Iteration 2189/3560 Training loss: 1.5588 1.9031 sec/batch
Epoch 13/20  Iteration 2190/3560 Training loss: 1.5589 1.8955 sec/batch
Epoch 13/20  Iteration 2191/3560 Training loss: 1.5586 1.9912 sec/batch
Epoch 13/20  Iteration 2192/3560 Training loss: 1.5587 1.9941 sec/batch
Epoch 13/20  Iteration 2193/3560 Training loss: 1.5590 1.8982 sec/batch
Epoch 13/20  Iteration 2194/3560 Training loss: 1.5587 1.9394 sec/batch
Epoch 13/20  Iteration 2195/3560 Training loss: 1.5581 1.9436 sec/batch
Epoch 13/20  Iteration 2196/3560 Training loss: 1.5587 2.1446 sec/batch
Epoch 13/20  Iteration 2197/3560 Training loss: 1.5585 1.9864 sec/batch
Epoch 13/20  Iteration 2198/3560 Training loss: 1.5595 1.9069 sec/batch
Epoch 13/20  Iteration 2199/3560 Training loss: 1.5598 1.9095 sec/batch
Epoch 13/20  Iteration 2200/3560 Training loss: 1.5600 1.8517 sec/batch
Epoch 13/20  Iteration 2201/3560 Training loss: 1.5599 1.8919 sec/batch
Epoch 13/20  Iteration 2202/3560 Training loss: 1.5601 2.0517 sec/batch
Epoch 13/20  Iteration 2203/3560 Training loss: 1.5601 1.8709 sec/batch
Epoch 13/20  Iteration 2204/3560 Training loss: 1.5598 1.9121 sec/batch
Epoch 13/20  Iteration 2205/3560 Training loss: 1.5598 1.8817 sec/batch
Epoch 13/20  Iteration 2206/3560 Training loss: 1.5597 1.9234 sec/batch
Epoch 13/20  Iteration 2207/3560 Training loss: 1.5603 1.8728 sec/batch
Epoch 13/20  Iteration 2208/3560 Training loss: 1.5604 1.8867 sec/batch
Epoch 13/20  Iteration 2209/3560 Training loss: 1.5608 1.8988 sec/batch
Epoch 13/20  Iteration 2210/3560 Training loss: 1.5605 1.8947 sec/batch
Epoch 13/20  Iteration 2211/3560 Training loss: 1.5604 1.8391 sec/batch
Epoch 13/20  Iteration 2212/3560 Training loss: 1.5606 1.9139 sec/batch
Epoch 13/20  Iteration 2213/3560 Training loss: 1.5605 1.9948 sec/batch
Epoch 13/20  Iteration 2214/3560 Training loss: 1.5603 1.9617 sec/batch
Epoch 13/20  Iteration 2215/3560 Training loss: 1.5597 1.9248 sec/batch
Epoch 13/20  Iteration 2216/3560 Training loss: 1.5594 1.8877 sec/batch
Epoch 13/20  Iteration 2217/3560 Training loss: 1.5589 2.0106 sec/batch
Epoch 13/20  Iteration 2218/3560 Training loss: 1.5589 2.1163 sec/batch
Epoch 13/20  Iteration 2219/3560 Training loss: 1.5582 2.0351 sec/batch
Epoch 13/20  Iteration 2220/3560 Training loss: 1.5581 1.9713 sec/batch
Epoch 13/20  Iteration 2221/3560 Training loss: 1.5578 1.9042 sec/batch
Epoch 13/20  Iteration 2222/3560 Training loss: 1.5575 1.8684 sec/batch
Epoch 13/20  Iteration 2223/3560 Training loss: 1.5572 1.8721 sec/batch
Epoch 13/20  Iteration 2224/3560 Training loss: 1.5569 1.9338 sec/batch
Epoch 13/20  Iteration 2225/3560 Training loss: 1.5564 1.9141 sec/batch
Epoch 13/20  Iteration 2226/3560 Training loss: 1.5566 1.8620 sec/batch
Epoch 13/20  Iteration 2227/3560 Training loss: 1.5562 1.9314 sec/batch
Epoch 13/20  Iteration 2228/3560 Training loss: 1.5560 1.9181 sec/batch
Epoch 13/20  Iteration 2229/3560 Training loss: 1.5557 1.8462 sec/batch
Epoch 13/20  Iteration 2230/3560 Training loss: 1.5553 1.8654 sec/batch
Epoch 13/20  Iteration 2231/3560 Training loss: 1.5550 1.8698 sec/batch
Epoch 13/20  Iteration 2232/3560 Training loss: 1.5550 1.9202 sec/batch
Epoch 13/20  Iteration 2233/3560 Training loss: 1.5548 2.0861 sec/batch
Epoch 13/20  Iteration 2234/3560 Training loss: 1.5545 2.0636 sec/batch
Epoch 13/20  Iteration 2235/3560 Training loss: 1.5541 2.0273 sec/batch
Epoch 13/20  Iteration 2236/3560 Training loss: 1.5537 1.9284 sec/batch
Epoch 13/20  Iteration 2237/3560 Training loss: 1.5537 1.9363 sec/batch
Epoch 13/20  Iteration 2238/3560 Training loss: 1.5536 1.9206 sec/batch
Epoch 13/20  Iteration 2239/3560 Training loss: 1.5534 2.1749 sec/batch
Epoch 13/20  Iteration 2240/3560 Training loss: 1.5532 2.0981 sec/batch
Epoch 13/20  Iteration 2241/3560 Training loss: 1.5530 1.9001 sec/batch
Epoch 13/20  Iteration 2242/3560 Training loss: 1.5529 1.9050 sec/batch
Epoch 13/20  Iteration 2243/3560 Training loss: 1.5527 1.8708 sec/batch
Epoch 13/20  Iteration 2244/3560 Training loss: 1.5527 1.8817 sec/batch
Epoch 13/20  Iteration 2245/3560 Training loss: 1.5526 1.9210 sec/batch
Epoch 13/20  Iteration 2246/3560 Training loss: 1.5526 1.9058 sec/batch
Epoch 13/20  Iteration 2247/3560 Training loss: 1.5525 1.8960 sec/batch
Epoch 13/20  Iteration 2248/3560 Training loss: 1.5523 1.8475 sec/batch
Epoch 13/20  Iteration 2249/3560 Training loss: 1.5521 1.9107 sec/batch
Epoch 13/20  Iteration 2250/3560 Training loss: 1.5519 1.9268 sec/batch
Epoch 13/20  Iteration 2251/3560 Training loss: 1.5516 1.9696 sec/batch
Epoch 13/20  Iteration 2252/3560 Training loss: 1.5512 1.8781 sec/batch
Epoch 13/20  Iteration 2253/3560 Training loss: 1.5511 1.8613 sec/batch
Epoch 13/20  Iteration 2254/3560 Training loss: 1.5510 1.9096 sec/batch
Epoch 13/20  Iteration 2255/3560 Training loss: 1.5510 1.8969 sec/batch
Epoch 13/20  Iteration 2256/3560 Training loss: 1.5508 2.0148 sec/batch
Epoch 13/20  Iteration 2257/3560 Training loss: 1.5508 1.9959 sec/batch
Epoch 13/20  Iteration 2258/3560 Training loss: 1.5503 1.9214 sec/batch
Epoch 13/20  Iteration 2259/3560 Training loss: 1.5499 1.8941 sec/batch
Epoch 13/20  Iteration 2260/3560 Training loss: 1.5500 1.9447 sec/batch
Epoch 13/20  Iteration 2261/3560 Training loss: 1.5500 2.1209 sec/batch
Epoch 13/20  Iteration 2262/3560 Training loss: 1.5496 1.9961 sec/batch
Epoch 13/20  Iteration 2263/3560 Training loss: 1.5496 1.9328 sec/batch
Epoch 13/20  Iteration 2264/3560 Training loss: 1.5496 1.9210 sec/batch
Epoch 13/20  Iteration 2265/3560 Training loss: 1.5495 1.9060 sec/batch
Epoch 13/20  Iteration 2266/3560 Training loss: 1.5492 1.9068 sec/batch
Epoch 13/20  Iteration 2267/3560 Training loss: 1.5489 1.9289 sec/batch
Epoch 13/20  Iteration 2268/3560 Training loss: 1.5488 1.9042 sec/batch
Epoch 13/20  Iteration 2269/3560 Training loss: 1.5489 1.9197 sec/batch
Epoch 13/20  Iteration 2270/3560 Training loss: 1.5489 1.8612 sec/batch
Epoch 13/20  Iteration 2271/3560 Training loss: 1.5488 1.9262 sec/batch
Epoch 13/20  Iteration 2272/3560 Training loss: 1.5489 1.9050 sec/batch
Epoch 13/20  Iteration 2273/3560 Training loss: 1.5490 1.8879 sec/batch
Epoch 13/20  Iteration 2274/3560 Training loss: 1.5490 1.8889 sec/batch
Epoch 13/20  Iteration 2275/3560 Training loss: 1.5491 1.9161 sec/batch
Epoch 13/20  Iteration 2276/3560 Training loss: 1.5490 1.9030 sec/batch
Epoch 13/20  Iteration 2277/3560 Training loss: 1.5492 1.8962 sec/batch
Epoch 13/20  Iteration 2278/3560 Training loss: 1.5491 1.9872 sec/batch
Epoch 13/20  Iteration 2279/3560 Training loss: 1.5489 1.9837 sec/batch
Epoch 13/20  Iteration 2280/3560 Training loss: 1.5490 1.9120 sec/batch
Epoch 13/20  Iteration 2281/3560 Training loss: 1.5489 1.9296 sec/batch
Epoch 13/20  Iteration 2282/3560 Training loss: 1.5490 2.0107 sec/batch
Epoch 13/20  Iteration 2283/3560 Training loss: 1.5490 2.1425 sec/batch
Epoch 13/20  Iteration 2284/3560 Training loss: 1.5492 1.9532 sec/batch
Epoch 13/20  Iteration 2285/3560 Training loss: 1.5492 1.8821 sec/batch
Epoch 13/20  Iteration 2286/3560 Training loss: 1.5491 1.9156 sec/batch
Epoch 13/20  Iteration 2287/3560 Training loss: 1.5488 1.9171 sec/batch
Epoch 13/20  Iteration 2288/3560 Training loss: 1.5487 1.9299 sec/batch
Epoch 13/20  Iteration 2289/3560 Training loss: 1.5488 1.9100 sec/batch
Epoch 13/20  Iteration 2290/3560 Training loss: 1.5487 1.9259 sec/batch
Epoch 13/20  Iteration 2291/3560 Training loss: 1.5487 1.8973 sec/batch
Epoch 13/20  Iteration 2292/3560 Training loss: 1.5487 1.8742 sec/batch
Epoch 13/20  Iteration 2293/3560 Training loss: 1.5488 1.9269 sec/batch
Epoch 13/20  Iteration 2294/3560 Training loss: 1.5488 2.0644 sec/batch
Epoch 13/20  Iteration 2295/3560 Training loss: 1.5485 1.9639 sec/batch
Epoch 13/20  Iteration 2296/3560 Training loss: 1.5486 1.8987 sec/batch
Epoch 13/20  Iteration 2297/3560 Training loss: 1.5488 1.8591 sec/batch
Epoch 13/20  Iteration 2298/3560 Training loss: 1.5487 1.8792 sec/batch
Epoch 13/20  Iteration 2299/3560 Training loss: 1.5488 2.0087 sec/batch
Epoch 13/20  Iteration 2300/3560 Training loss: 1.5488 1.9872 sec/batch
Epoch 13/20  Iteration 2301/3560 Training loss: 1.5488 1.8673 sec/batch
Epoch 13/20  Iteration 2302/3560 Training loss: 1.5487 1.9445 sec/batch
Epoch 13/20  Iteration 2303/3560 Training loss: 1.5489 1.9014 sec/batch
Epoch 13/20  Iteration 2304/3560 Training loss: 1.5493 2.4261 sec/batch
Epoch 13/20  Iteration 2305/3560 Training loss: 1.5493 2.1447 sec/batch
Epoch 13/20  Iteration 2306/3560 Training loss: 1.5493 1.9733 sec/batch
Epoch 13/20  Iteration 2307/3560 Training loss: 1.5492 1.9300 sec/batch
Epoch 13/20  Iteration 2308/3560 Training loss: 1.5490 1.9170 sec/batch
Epoch 13/20  Iteration 2309/3560 Training loss: 1.5491 1.8879 sec/batch
Epoch 13/20  Iteration 2310/3560 Training loss: 1.5491 1.9888 sec/batch
Epoch 13/20  Iteration 2311/3560 Training loss: 1.5492 1.8890 sec/batch
Epoch 13/20  Iteration 2312/3560 Training loss: 1.5490 1.9262 sec/batch
Epoch 13/20  Iteration 2313/3560 Training loss: 1.5489 1.8827 sec/batch
Epoch 13/20  Iteration 2314/3560 Training loss: 1.5491 1.8907 sec/batch
Epoch 14/20  Iteration 2315/3560 Training loss: 1.6394 1.8913 sec/batch
Epoch 14/20  Iteration 2316/3560 Training loss: 1.5967 1.8918 sec/batch
Epoch 14/20  Iteration 2317/3560 Training loss: 1.5795 1.9072 sec/batch
Epoch 14/20  Iteration 2318/3560 Training loss: 1.5731 1.8923 sec/batch
Epoch 14/20  Iteration 2319/3560 Training loss: 1.5652 1.8837 sec/batch
Epoch 14/20  Iteration 2320/3560 Training loss: 1.5557 1.8978 sec/batch
Epoch 14/20  Iteration 2321/3560 Training loss: 1.5559 2.0171 sec/batch
Epoch 14/20  Iteration 2322/3560 Training loss: 1.5522 2.0046 sec/batch
Epoch 14/20  Iteration 2323/3560 Training loss: 1.5523 1.8761 sec/batch
Epoch 14/20  Iteration 2324/3560 Training loss: 1.5507 1.9318 sec/batch
Epoch 14/20  Iteration 2325/3560 Training loss: 1.5468 1.9838 sec/batch
Epoch 14/20  Iteration 2326/3560 Training loss: 1.5453 2.1202 sec/batch
Epoch 14/20  Iteration 2327/3560 Training loss: 1.5449 1.9779 sec/batch
Epoch 14/20  Iteration 2328/3560 Training loss: 1.5468 1.9235 sec/batch
Epoch 14/20  Iteration 2329/3560 Training loss: 1.5454 1.8775 sec/batch
Epoch 14/20  Iteration 2330/3560 Training loss: 1.5435 1.8659 sec/batch
Epoch 14/20  Iteration 2331/3560 Training loss: 1.5441 1.9343 sec/batch
Epoch 14/20  Iteration 2332/3560 Training loss: 1.5457 1.9397 sec/batch
Epoch 14/20  Iteration 2333/3560 Training loss: 1.5455 1.9117 sec/batch
Epoch 14/20  Iteration 2334/3560 Training loss: 1.5462 1.8956 sec/batch
Epoch 14/20  Iteration 2335/3560 Training loss: 1.5454 1.9435 sec/batch
Epoch 14/20  Iteration 2336/3560 Training loss: 1.5460 1.9173 sec/batch
Epoch 14/20  Iteration 2337/3560 Training loss: 1.5452 1.8652 sec/batch
Epoch 14/20  Iteration 2338/3560 Training loss: 1.5449 1.8922 sec/batch
Epoch 14/20  Iteration 2339/3560 Training loss: 1.5450 1.8965 sec/batch
Epoch 14/20  Iteration 2340/3560 Training loss: 1.5435 1.8626 sec/batch
Epoch 14/20  Iteration 2341/3560 Training loss: 1.5424 1.8771 sec/batch
Epoch 14/20  Iteration 2342/3560 Training loss: 1.5425 1.9579 sec/batch
Epoch 14/20  Iteration 2343/3560 Training loss: 1.5432 2.0033 sec/batch
Epoch 14/20  Iteration 2344/3560 Training loss: 1.5437 1.9368 sec/batch
Epoch 14/20  Iteration 2345/3560 Training loss: 1.5434 1.9252 sec/batch
Epoch 14/20  Iteration 2346/3560 Training loss: 1.5423 1.8790 sec/batch
Epoch 14/20  Iteration 2347/3560 Training loss: 1.5424 2.1136 sec/batch
Epoch 14/20  Iteration 2348/3560 Training loss: 1.5423 2.1406 sec/batch
Epoch 14/20  Iteration 2349/3560 Training loss: 1.5420 1.9132 sec/batch
Epoch 14/20  Iteration 2350/3560 Training loss: 1.5418 1.8957 sec/batch
Epoch 14/20  Iteration 2351/3560 Training loss: 1.5408 1.8783 sec/batch
Epoch 14/20  Iteration 2352/3560 Training loss: 1.5398 1.8993 sec/batch
Epoch 14/20  Iteration 2353/3560 Training loss: 1.5385 1.8889 sec/batch
Epoch 14/20  Iteration 2354/3560 Training loss: 1.5378 1.9290 sec/batch
Epoch 14/20  Iteration 2355/3560 Training loss: 1.5372 1.9312 sec/batch
Epoch 14/20  Iteration 2356/3560 Training loss: 1.5377 2.0639 sec/batch
Epoch 14/20  Iteration 2357/3560 Training loss: 1.5372 1.9685 sec/batch
Epoch 14/20  Iteration 2358/3560 Training loss: 1.5366 1.9362 sec/batch
Epoch 14/20  Iteration 2359/3560 Training loss: 1.5367 1.8985 sec/batch
Epoch 14/20  Iteration 2360/3560 Training loss: 1.5356 1.9015 sec/batch
Epoch 14/20  Iteration 2361/3560 Training loss: 1.5354 1.8555 sec/batch
Epoch 14/20  Iteration 2362/3560 Training loss: 1.5350 1.8729 sec/batch
Epoch 14/20  Iteration 2363/3560 Training loss: 1.5349 1.8754 sec/batch
Epoch 14/20  Iteration 2364/3560 Training loss: 1.5354 2.2575 sec/batch
Epoch 14/20  Iteration 2365/3560 Training loss: 1.5350 1.9970 sec/batch
Epoch 14/20  Iteration 2366/3560 Training loss: 1.5360 1.9143 sec/batch
Epoch 14/20  Iteration 2367/3560 Training loss: 1.5358 1.8913 sec/batch
Epoch 14/20  Iteration 2368/3560 Training loss: 1.5359 1.9603 sec/batch
Epoch 14/20  Iteration 2369/3560 Training loss: 1.5356 2.1216 sec/batch
Epoch 14/20  Iteration 2370/3560 Training loss: 1.5358 1.9878 sec/batch
Epoch 14/20  Iteration 2371/3560 Training loss: 1.5361 1.9051 sec/batch
Epoch 14/20  Iteration 2372/3560 Training loss: 1.5357 1.8885 sec/batch
Epoch 14/20  Iteration 2373/3560 Training loss: 1.5352 1.9107 sec/batch
Epoch 14/20  Iteration 2374/3560 Training loss: 1.5358 2.1155 sec/batch
Epoch 14/20  Iteration 2375/3560 Training loss: 1.5358 1.9320 sec/batch
Epoch 14/20  Iteration 2376/3560 Training loss: 1.5368 1.8870 sec/batch
Epoch 14/20  Iteration 2377/3560 Training loss: 1.5370 1.8487 sec/batch
Epoch 14/20  Iteration 2378/3560 Training loss: 1.5371 1.9375 sec/batch
Epoch 14/20  Iteration 2379/3560 Training loss: 1.5369 1.8946 sec/batch
Epoch 14/20  Iteration 2380/3560 Training loss: 1.5371 1.9214 sec/batch
Epoch 14/20  Iteration 2381/3560 Training loss: 1.5372 1.8778 sec/batch
Epoch 14/20  Iteration 2382/3560 Training loss: 1.5369 1.9058 sec/batch
Epoch 14/20  Iteration 2383/3560 Training loss: 1.5369 1.9181 sec/batch
Epoch 14/20  Iteration 2384/3560 Training loss: 1.5370 1.9106 sec/batch
Epoch 14/20  Iteration 2385/3560 Training loss: 1.5374 1.9167 sec/batch
Epoch 14/20  Iteration 2386/3560 Training loss: 1.5375 2.0055 sec/batch
Epoch 14/20  Iteration 2387/3560 Training loss: 1.5380 1.9564 sec/batch
Epoch 14/20  Iteration 2388/3560 Training loss: 1.5377 1.8714 sec/batch
Epoch 14/20  Iteration 2389/3560 Training loss: 1.5375 2.0566 sec/batch
Epoch 14/20  Iteration 2390/3560 Training loss: 1.5377 2.3291 sec/batch
Epoch 14/20  Iteration 2391/3560 Training loss: 1.5376 2.8589 sec/batch
Epoch 14/20  Iteration 2392/3560 Training loss: 1.5375 2.6908 sec/batch
Epoch 14/20  Iteration 2393/3560 Training loss: 1.5369 2.4785 sec/batch
Epoch 14/20  Iteration 2394/3560 Training loss: 1.5367 2.7489 sec/batch
Epoch 14/20  Iteration 2395/3560 Training loss: 1.5361 2.4485 sec/batch
Epoch 14/20  Iteration 2396/3560 Training loss: 1.5361 1.9148 sec/batch
Epoch 14/20  Iteration 2397/3560 Training loss: 1.5355 1.9019 sec/batch
Epoch 14/20  Iteration 2398/3560 Training loss: 1.5354 1.9912 sec/batch
Epoch 14/20  Iteration 2399/3560 Training loss: 1.5351 1.9737 sec/batch
Epoch 14/20  Iteration 2400/3560 Training loss: 1.5350 1.9051 sec/batch
Epoch 14/20  Iteration 2401/3560 Training loss: 1.5348 1.9122 sec/batch
Epoch 14/20  Iteration 2402/3560 Training loss: 1.5346 1.9316 sec/batch
Epoch 14/20  Iteration 2403/3560 Training loss: 1.5342 1.9304 sec/batch
Epoch 14/20  Iteration 2404/3560 Training loss: 1.5343 1.9479 sec/batch
Epoch 14/20  Iteration 2405/3560 Training loss: 1.5340 2.1364 sec/batch
Epoch 14/20  Iteration 2406/3560 Training loss: 1.5338 2.0725 sec/batch
Epoch 14/20  Iteration 2407/3560 Training loss: 1.5334 1.9229 sec/batch
Epoch 14/20  Iteration 2408/3560 Training loss: 1.5331 2.0608 sec/batch
Epoch 14/20  Iteration 2409/3560 Training loss: 1.5327 2.0780 sec/batch
Epoch 14/20  Iteration 2410/3560 Training loss: 1.5327 2.1387 sec/batch
Epoch 14/20  Iteration 2411/3560 Training loss: 1.5326 2.0113 sec/batch
Epoch 14/20  Iteration 2412/3560 Training loss: 1.5324 1.8969 sec/batch
Epoch 14/20  Iteration 2413/3560 Training loss: 1.5320 1.9527 sec/batch
Epoch 14/20  Iteration 2414/3560 Training loss: 1.5315 2.0719 sec/batch
Epoch 14/20  Iteration 2415/3560 Training loss: 1.5315 2.1046 sec/batch
Epoch 14/20  Iteration 2416/3560 Training loss: 1.5313 1.9245 sec/batch
Epoch 14/20  Iteration 2417/3560 Training loss: 1.5312 1.9461 sec/batch
Epoch 14/20  Iteration 2418/3560 Training loss: 1.5311 1.9023 sec/batch
Epoch 14/20  Iteration 2419/3560 Training loss: 1.5308 2.0624 sec/batch
Epoch 14/20  Iteration 2420/3560 Training loss: 1.5307 1.9303 sec/batch
Epoch 14/20  Iteration 2421/3560 Training loss: 1.5306 1.9240 sec/batch
Epoch 14/20  Iteration 2422/3560 Training loss: 1.5305 1.9464 sec/batch
Epoch 14/20  Iteration 2423/3560 Training loss: 1.5305 1.9615 sec/batch
Epoch 14/20  Iteration 2424/3560 Training loss: 1.5304 1.9250 sec/batch
Epoch 14/20  Iteration 2425/3560 Training loss: 1.5302 1.9560 sec/batch
Epoch 14/20  Iteration 2426/3560 Training loss: 1.5302 2.0186 sec/batch
Epoch 14/20  Iteration 2427/3560 Training loss: 1.5301 1.9926 sec/batch
Epoch 14/20  Iteration 2428/3560 Training loss: 1.5300 2.0257 sec/batch
Epoch 14/20  Iteration 2429/3560 Training loss: 1.5297 2.0049 sec/batch
Epoch 14/20  Iteration 2430/3560 Training loss: 1.5293 2.0449 sec/batch
Epoch 14/20  Iteration 2431/3560 Training loss: 1.5293 2.1481 sec/batch
Epoch 14/20  Iteration 2432/3560 Training loss: 1.5293 2.0103 sec/batch
Epoch 14/20  Iteration 2433/3560 Training loss: 1.5292 2.0189 sec/batch
Epoch 14/20  Iteration 2434/3560 Training loss: 1.5291 2.0435 sec/batch
Epoch 14/20  Iteration 2435/3560 Training loss: 1.5290 2.0381 sec/batch
Epoch 14/20  Iteration 2436/3560 Training loss: 1.5286 2.0145 sec/batch
Epoch 14/20  Iteration 2437/3560 Training loss: 1.5283 1.9732 sec/batch
Epoch 14/20  Iteration 2438/3560 Training loss: 1.5283 1.9330 sec/batch
Epoch 14/20  Iteration 2439/3560 Training loss: 1.5283 1.9609 sec/batch
Epoch 14/20  Iteration 2440/3560 Training loss: 1.5280 1.9651 sec/batch
Epoch 14/20  Iteration 2441/3560 Training loss: 1.5280 2.0325 sec/batch
Epoch 14/20  Iteration 2442/3560 Training loss: 1.5280 2.0696 sec/batch
Epoch 14/20  Iteration 2443/3560 Training loss: 1.5279 1.9191 sec/batch
Epoch 14/20  Iteration 2444/3560 Training loss: 1.5277 1.9569 sec/batch
Epoch 14/20  Iteration 2445/3560 Training loss: 1.5274 1.9786 sec/batch
Epoch 14/20  Iteration 2446/3560 Training loss: 1.5272 2.0992 sec/batch
Epoch 14/20  Iteration 2447/3560 Training loss: 1.5274 2.0301 sec/batch
Epoch 14/20  Iteration 2448/3560 Training loss: 1.5273 2.0135 sec/batch
Epoch 14/20  Iteration 2449/3560 Training loss: 1.5273 2.0307 sec/batch
Epoch 14/20  Iteration 2450/3560 Training loss: 1.5273 1.9373 sec/batch
Epoch 14/20  Iteration 2451/3560 Training loss: 1.5275 2.0053 sec/batch
Epoch 14/20  Iteration 2452/3560 Training loss: 1.5275 2.1878 sec/batch
Epoch 14/20  Iteration 2453/3560 Training loss: 1.5275 1.9435 sec/batch
Epoch 14/20  Iteration 2454/3560 Training loss: 1.5275 2.0348 sec/batch
Epoch 14/20  Iteration 2455/3560 Training loss: 1.5278 2.3394 sec/batch
Epoch 14/20  Iteration 2456/3560 Training loss: 1.5278 1.9336 sec/batch
Epoch 14/20  Iteration 2457/3560 Training loss: 1.5277 1.9525 sec/batch
Epoch 14/20  Iteration 2458/3560 Training loss: 1.5279 2.0037 sec/batch
Epoch 14/20  Iteration 2459/3560 Training loss: 1.5279 1.9298 sec/batch
Epoch 14/20  Iteration 2460/3560 Training loss: 1.5281 1.9342 sec/batch
Epoch 14/20  Iteration 2461/3560 Training loss: 1.5280 1.9029 sec/batch
Epoch 14/20  Iteration 2462/3560 Training loss: 1.5283 2.0284 sec/batch
Epoch 14/20  Iteration 2463/3560 Training loss: 1.5284 1.9532 sec/batch
Epoch 14/20  Iteration 2464/3560 Training loss: 1.5283 1.9735 sec/batch
Epoch 14/20  Iteration 2465/3560 Training loss: 1.5281 1.9206 sec/batch
Epoch 14/20  Iteration 2466/3560 Training loss: 1.5281 1.9333 sec/batch
Epoch 14/20  Iteration 2467/3560 Training loss: 1.5281 1.9060 sec/batch
Epoch 14/20  Iteration 2468/3560 Training loss: 1.5281 2.1241 sec/batch
Epoch 14/20  Iteration 2469/3560 Training loss: 1.5281 2.0289 sec/batch
Epoch 14/20  Iteration 2470/3560 Training loss: 1.5281 1.9660 sec/batch
Epoch 14/20  Iteration 2471/3560 Training loss: 1.5281 2.0229 sec/batch
Epoch 14/20  Iteration 2472/3560 Training loss: 1.5281 2.0438 sec/batch
Epoch 14/20  Iteration 2473/3560 Training loss: 1.5279 2.2422 sec/batch
Epoch 14/20  Iteration 2474/3560 Training loss: 1.5280 1.9697 sec/batch
Epoch 14/20  Iteration 2475/3560 Training loss: 1.5282 1.9826 sec/batch
Epoch 14/20  Iteration 2476/3560 Training loss: 1.5282 1.9607 sec/batch
Epoch 14/20  Iteration 2477/3560 Training loss: 1.5283 1.9301 sec/batch
Epoch 14/20  Iteration 2478/3560 Training loss: 1.5282 1.9505 sec/batch
Epoch 14/20  Iteration 2479/3560 Training loss: 1.5281 2.0043 sec/batch
Epoch 14/20  Iteration 2480/3560 Training loss: 1.5280 1.9608 sec/batch
Epoch 14/20  Iteration 2481/3560 Training loss: 1.5282 1.9057 sec/batch
Epoch 14/20  Iteration 2482/3560 Training loss: 1.5286 1.9713 sec/batch
Epoch 14/20  Iteration 2483/3560 Training loss: 1.5285 1.9427 sec/batch
Epoch 14/20  Iteration 2484/3560 Training loss: 1.5285 1.8902 sec/batch
Epoch 14/20  Iteration 2485/3560 Training loss: 1.5284 1.8990 sec/batch
Epoch 14/20  Iteration 2486/3560 Training loss: 1.5282 1.9314 sec/batch
Epoch 14/20  Iteration 2487/3560 Training loss: 1.5283 1.9290 sec/batch
Epoch 14/20  Iteration 2488/3560 Training loss: 1.5282 1.9621 sec/batch
Epoch 14/20  Iteration 2489/3560 Training loss: 1.5283 2.0237 sec/batch
Epoch 14/20  Iteration 2490/3560 Training loss: 1.5282 2.0114 sec/batch
Epoch 14/20  Iteration 2491/3560 Training loss: 1.5282 1.9038 sec/batch
Epoch 14/20  Iteration 2492/3560 Training loss: 1.5282 1.8964 sec/batch
Epoch 15/20  Iteration 2493/3560 Training loss: 1.6238 1.9555 sec/batch
Epoch 15/20  Iteration 2494/3560 Training loss: 1.5787 2.1804 sec/batch
Epoch 15/20  Iteration 2495/3560 Training loss: 1.5585 2.0132 sec/batch
Epoch 15/20  Iteration 2496/3560 Training loss: 1.5476 1.9133 sec/batch
Epoch 15/20  Iteration 2497/3560 Training loss: 1.5389 1.9636 sec/batch
Epoch 15/20  Iteration 2498/3560 Training loss: 1.5295 1.9480 sec/batch
Epoch 15/20  Iteration 2499/3560 Training loss: 1.5316 1.8890 sec/batch
Epoch 15/20  Iteration 2500/3560 Training loss: 1.5285 1.9561 sec/batch
Epoch 15/20  Iteration 2501/3560 Training loss: 1.5298 1.9184 sec/batch
Epoch 15/20  Iteration 2502/3560 Training loss: 1.5288 1.9091 sec/batch
Epoch 15/20  Iteration 2503/3560 Training loss: 1.5256 1.8829 sec/batch
Epoch 15/20  Iteration 2504/3560 Training loss: 1.5248 1.9556 sec/batch
Epoch 15/20  Iteration 2505/3560 Training loss: 1.5242 1.9161 sec/batch
Epoch 15/20  Iteration 2506/3560 Training loss: 1.5264 1.9442 sec/batch
Epoch 15/20  Iteration 2507/3560 Training loss: 1.5257 1.9341 sec/batch
Epoch 15/20  Iteration 2508/3560 Training loss: 1.5242 1.9448 sec/batch
Epoch 15/20  Iteration 2509/3560 Training loss: 1.5245 1.9320 sec/batch
Epoch 15/20  Iteration 2510/3560 Training loss: 1.5264 1.9500 sec/batch
Epoch 15/20  Iteration 2511/3560 Training loss: 1.5263 2.0638 sec/batch
Epoch 15/20  Iteration 2512/3560 Training loss: 1.5272 1.9074 sec/batch
Epoch 15/20  Iteration 2513/3560 Training loss: 1.5267 1.9544 sec/batch
Epoch 15/20  Iteration 2514/3560 Training loss: 1.5277 1.9263 sec/batch
Epoch 15/20  Iteration 2515/3560 Training loss: 1.5271 2.0883 sec/batch
Epoch 15/20  Iteration 2516/3560 Training loss: 1.5272 2.1450 sec/batch
Epoch 15/20  Iteration 2517/3560 Training loss: 1.5271 1.8759 sec/batch
Epoch 15/20  Iteration 2518/3560 Training loss: 1.5255 1.9113 sec/batch
Epoch 15/20  Iteration 2519/3560 Training loss: 1.5242 1.8833 sec/batch
Epoch 15/20  Iteration 2520/3560 Training loss: 1.5242 1.9018 sec/batch
Epoch 15/20  Iteration 2521/3560 Training loss: 1.5245 1.8955 sec/batch
Epoch 15/20  Iteration 2522/3560 Training loss: 1.5246 1.9275 sec/batch
Epoch 15/20  Iteration 2523/3560 Training loss: 1.5240 1.9020 sec/batch
Epoch 15/20  Iteration 2524/3560 Training loss: 1.5232 1.8883 sec/batch
Epoch 15/20  Iteration 2525/3560 Training loss: 1.5232 1.9549 sec/batch
Epoch 15/20  Iteration 2526/3560 Training loss: 1.5231 1.9184 sec/batch
Epoch 15/20  Iteration 2527/3560 Training loss: 1.5229 1.9043 sec/batch
Epoch 15/20  Iteration 2528/3560 Training loss: 1.5225 1.8867 sec/batch
Epoch 15/20  Iteration 2529/3560 Training loss: 1.5217 1.9134 sec/batch
Epoch 15/20  Iteration 2530/3560 Training loss: 1.5207 1.9004 sec/batch
Epoch 15/20  Iteration 2531/3560 Training loss: 1.5192 1.8716 sec/batch
Epoch 15/20  Iteration 2532/3560 Training loss: 1.5184 2.0678 sec/batch
Epoch 15/20  Iteration 2533/3560 Training loss: 1.5181 2.0504 sec/batch
Epoch 15/20  Iteration 2534/3560 Training loss: 1.5185 1.9498 sec/batch
Epoch 15/20  Iteration 2535/3560 Training loss: 1.5181 2.0749 sec/batch
Epoch 15/20  Iteration 2536/3560 Training loss: 1.5173 2.2045 sec/batch
Epoch 15/20  Iteration 2537/3560 Training loss: 1.5173 2.1244 sec/batch
Epoch 15/20  Iteration 2538/3560 Training loss: 1.5164 2.0319 sec/batch
Epoch 15/20  Iteration 2539/3560 Training loss: 1.5161 1.9313 sec/batch
Epoch 15/20  Iteration 2540/3560 Training loss: 1.5155 1.9295 sec/batch
Epoch 15/20  Iteration 2541/3560 Training loss: 1.5154 1.8961 sec/batch
Epoch 15/20  Iteration 2542/3560 Training loss: 1.5161 1.8942 sec/batch
Epoch 15/20  Iteration 2543/3560 Training loss: 1.5158 1.9022 sec/batch
Epoch 15/20  Iteration 2544/3560 Training loss: 1.5166 2.0029 sec/batch
Epoch 15/20  Iteration 2545/3560 Training loss: 1.5166 1.8903 sec/batch
Epoch 15/20  Iteration 2546/3560 Training loss: 1.5166 1.9422 sec/batch
Epoch 15/20  Iteration 2547/3560 Training loss: 1.5164 1.8986 sec/batch
Epoch 15/20  Iteration 2548/3560 Training loss: 1.5164 1.9313 sec/batch
Epoch 15/20  Iteration 2549/3560 Training loss: 1.5167 1.8808 sec/batch
Epoch 15/20  Iteration 2550/3560 Training loss: 1.5164 1.9156 sec/batch
Epoch 15/20  Iteration 2551/3560 Training loss: 1.5158 1.9291 sec/batch
Epoch 15/20  Iteration 2552/3560 Training loss: 1.5162 1.9105 sec/batch
Epoch 15/20  Iteration 2553/3560 Training loss: 1.5162 2.0315 sec/batch
Epoch 15/20  Iteration 2554/3560 Training loss: 1.5173 2.0087 sec/batch
Epoch 15/20  Iteration 2555/3560 Training loss: 1.5178 1.9490 sec/batch
Epoch 15/20  Iteration 2556/3560 Training loss: 1.5178 1.9168 sec/batch
Epoch 15/20  Iteration 2557/3560 Training loss: 1.5178 1.9280 sec/batch
Epoch 15/20  Iteration 2558/3560 Training loss: 1.5180 2.1154 sec/batch
Epoch 15/20  Iteration 2559/3560 Training loss: 1.5181 2.0820 sec/batch
Epoch 15/20  Iteration 2560/3560 Training loss: 1.5179 1.8967 sec/batch
Epoch 15/20  Iteration 2561/3560 Training loss: 1.5179 1.8635 sec/batch
Epoch 15/20  Iteration 2562/3560 Training loss: 1.5179 1.8949 sec/batch
Epoch 15/20  Iteration 2563/3560 Training loss: 1.5185 1.9206 sec/batch
Epoch 15/20  Iteration 2564/3560 Training loss: 1.5187 1.8665 sec/batch
Epoch 15/20  Iteration 2565/3560 Training loss: 1.5192 2.0604 sec/batch
Epoch 15/20  Iteration 2566/3560 Training loss: 1.5189 2.0106 sec/batch
Epoch 15/20  Iteration 2567/3560 Training loss: 1.5188 2.0204 sec/batch
Epoch 15/20  Iteration 2568/3560 Training loss: 1.5191 1.9269 sec/batch
Epoch 15/20  Iteration 2569/3560 Training loss: 1.5189 1.9414 sec/batch
Epoch 15/20  Iteration 2570/3560 Training loss: 1.5189 1.8883 sec/batch
Epoch 15/20  Iteration 2571/3560 Training loss: 1.5183 1.9369 sec/batch
Epoch 15/20  Iteration 2572/3560 Training loss: 1.5181 1.8748 sec/batch
Epoch 15/20  Iteration 2573/3560 Training loss: 1.5176 1.9944 sec/batch
Epoch 15/20  Iteration 2574/3560 Training loss: 1.5176 1.9656 sec/batch
Epoch 15/20  Iteration 2575/3560 Training loss: 1.5171 2.6161 sec/batch
Epoch 15/20  Iteration 2576/3560 Training loss: 1.5169 2.5066 sec/batch
Epoch 15/20  Iteration 2577/3560 Training loss: 1.5167 2.2155 sec/batch
Epoch 15/20  Iteration 2578/3560 Training loss: 1.5164 2.2405 sec/batch
Epoch 15/20  Iteration 2579/3560 Training loss: 1.5161 2.4978 sec/batch
Epoch 15/20  Iteration 2580/3560 Training loss: 1.5159 3.0470 sec/batch
Epoch 15/20  Iteration 2581/3560 Training loss: 1.5154 2.4648 sec/batch
Epoch 15/20  Iteration 2582/3560 Training loss: 1.5155 2.6534 sec/batch
Epoch 15/20  Iteration 2583/3560 Training loss: 1.5151 2.4071 sec/batch
Epoch 15/20  Iteration 2584/3560 Training loss: 1.5150 2.0119 sec/batch
Epoch 15/20  Iteration 2585/3560 Training loss: 1.5146 2.0431 sec/batch
Epoch 15/20  Iteration 2586/3560 Training loss: 1.5143 1.9784 sec/batch
Epoch 15/20  Iteration 2587/3560 Training loss: 1.5139 2.0803 sec/batch
Epoch 15/20  Iteration 2588/3560 Training loss: 1.5140 1.9513 sec/batch
Epoch 15/20  Iteration 2589/3560 Training loss: 1.5139 1.9056 sec/batch
Epoch 15/20  Iteration 2590/3560 Training loss: 1.5135 1.9707 sec/batch
Epoch 15/20  Iteration 2591/3560 Training loss: 1.5132 1.9294 sec/batch
Epoch 15/20  Iteration 2592/3560 Training loss: 1.5128 1.9846 sec/batch
Epoch 15/20  Iteration 2593/3560 Training loss: 1.5128 2.0341 sec/batch
Epoch 15/20  Iteration 2594/3560 Training loss: 1.5128 2.1787 sec/batch
Epoch 15/20  Iteration 2595/3560 Training loss: 1.5127 2.0615 sec/batch
Epoch 15/20  Iteration 2596/3560 Training loss: 1.5126 2.0614 sec/batch
Epoch 15/20  Iteration 2597/3560 Training loss: 1.5123 1.9336 sec/batch
Epoch 15/20  Iteration 2598/3560 Training loss: 1.5122 2.2128 sec/batch
Epoch 15/20  Iteration 2599/3560 Training loss: 1.5121 2.1431 sec/batch
Epoch 15/20  Iteration 2600/3560 Training loss: 1.5121 1.9155 sec/batch
Epoch 15/20  Iteration 2601/3560 Training loss: 1.5119 1.9650 sec/batch
Epoch 15/20  Iteration 2602/3560 Training loss: 1.5119 1.9121 sec/batch
Epoch 15/20  Iteration 2603/3560 Training loss: 1.5118 1.9040 sec/batch
Epoch 15/20  Iteration 2604/3560 Training loss: 1.5117 2.2323 sec/batch
Epoch 15/20  Iteration 2605/3560 Training loss: 1.5115 2.1062 sec/batch
Epoch 15/20  Iteration 2606/3560 Training loss: 1.5113 2.0220 sec/batch
Epoch 15/20  Iteration 2607/3560 Training loss: 1.5111 1.9625 sec/batch
Epoch 15/20  Iteration 2608/3560 Training loss: 1.5107 1.9178 sec/batch
Epoch 15/20  Iteration 2609/3560 Training loss: 1.5107 1.9725 sec/batch
Epoch 15/20  Iteration 2610/3560 Training loss: 1.5107 1.9566 sec/batch
Epoch 15/20  Iteration 2611/3560 Training loss: 1.5106 1.9185 sec/batch
Epoch 15/20  Iteration 2612/3560 Training loss: 1.5105 1.9719 sec/batch
Epoch 15/20  Iteration 2613/3560 Training loss: 1.5104 2.0376 sec/batch
Epoch 15/20  Iteration 2614/3560 Training loss: 1.5100 2.0557 sec/batch
Epoch 15/20  Iteration 2615/3560 Training loss: 1.5096 2.0345 sec/batch
Epoch 15/20  Iteration 2616/3560 Training loss: 1.5097 1.9623 sec/batch
Epoch 15/20  Iteration 2617/3560 Training loss: 1.5096 1.9642 sec/batch
Epoch 15/20  Iteration 2618/3560 Training loss: 1.5092 1.9728 sec/batch
Epoch 15/20  Iteration 2619/3560 Training loss: 1.5092 2.2122 sec/batch
Epoch 15/20  Iteration 2620/3560 Training loss: 1.5092 2.1123 sec/batch
Epoch 15/20  Iteration 2621/3560 Training loss: 1.5090 1.9151 sec/batch
Epoch 15/20  Iteration 2622/3560 Training loss: 1.5087 1.9267 sec/batch
Epoch 15/20  Iteration 2623/3560 Training loss: 1.5083 1.9178 sec/batch
Epoch 15/20  Iteration 2624/3560 Training loss: 1.5081 1.9623 sec/batch
Epoch 15/20  Iteration 2625/3560 Training loss: 1.5082 1.9596 sec/batch
Epoch 15/20  Iteration 2626/3560 Training loss: 1.5082 1.9359 sec/batch
Epoch 15/20  Iteration 2627/3560 Training loss: 1.5082 2.0039 sec/batch
Epoch 15/20  Iteration 2628/3560 Training loss: 1.5082 1.9666 sec/batch
Epoch 15/20  Iteration 2629/3560 Training loss: 1.5084 1.8881 sec/batch
Epoch 15/20  Iteration 2630/3560 Training loss: 1.5085 1.9757 sec/batch
Epoch 15/20  Iteration 2631/3560 Training loss: 1.5085 1.9360 sec/batch
Epoch 15/20  Iteration 2632/3560 Training loss: 1.5083 1.9562 sec/batch
Epoch 15/20  Iteration 2633/3560 Training loss: 1.5086 1.9411 sec/batch
Epoch 15/20  Iteration 2634/3560 Training loss: 1.5086 1.9100 sec/batch
Epoch 15/20  Iteration 2635/3560 Training loss: 1.5085 2.0458 sec/batch
Epoch 15/20  Iteration 2636/3560 Training loss: 1.5087 2.1152 sec/batch
Epoch 15/20  Iteration 2637/3560 Training loss: 1.5086 1.9705 sec/batch
Epoch 15/20  Iteration 2638/3560 Training loss: 1.5087 1.9216 sec/batch
Epoch 15/20  Iteration 2639/3560 Training loss: 1.5087 1.8986 sec/batch
Epoch 15/20  Iteration 2640/3560 Training loss: 1.5089 2.2475 sec/batch
Epoch 15/20  Iteration 2641/3560 Training loss: 1.5090 2.0585 sec/batch
Epoch 15/20  Iteration 2642/3560 Training loss: 1.5088 1.8900 sec/batch
Epoch 15/20  Iteration 2643/3560 Training loss: 1.5085 1.9002 sec/batch
Epoch 15/20  Iteration 2644/3560 Training loss: 1.5085 1.9163 sec/batch
Epoch 15/20  Iteration 2645/3560 Training loss: 1.5086 1.9039 sec/batch
Epoch 15/20  Iteration 2646/3560 Training loss: 1.5085 2.0216 sec/batch
Epoch 15/20  Iteration 2647/3560 Training loss: 1.5085 1.9685 sec/batch
Epoch 15/20  Iteration 2648/3560 Training loss: 1.5085 1.8937 sec/batch
Epoch 15/20  Iteration 2649/3560 Training loss: 1.5085 1.9606 sec/batch
Epoch 15/20  Iteration 2650/3560 Training loss: 1.5085 1.9688 sec/batch
Epoch 15/20  Iteration 2651/3560 Training loss: 1.5082 1.9427 sec/batch
Epoch 15/20  Iteration 2652/3560 Training loss: 1.5084 2.0040 sec/batch
Epoch 15/20  Iteration 2653/3560 Training loss: 1.5086 1.9290 sec/batch
Epoch 15/20  Iteration 2654/3560 Training loss: 1.5086 1.9890 sec/batch
Epoch 15/20  Iteration 2655/3560 Training loss: 1.5087 2.1067 sec/batch
Epoch 15/20  Iteration 2656/3560 Training loss: 1.5087 2.0725 sec/batch
Epoch 15/20  Iteration 2657/3560 Training loss: 1.5086 2.0349 sec/batch
Epoch 15/20  Iteration 2658/3560 Training loss: 1.5086 1.9434 sec/batch
Epoch 15/20  Iteration 2659/3560 Training loss: 1.5087 1.9425 sec/batch
Epoch 15/20  Iteration 2660/3560 Training loss: 1.5091 1.9306 sec/batch
Epoch 15/20  Iteration 2661/3560 Training loss: 1.5091 2.2224 sec/batch
Epoch 15/20  Iteration 2662/3560 Training loss: 1.5091 2.1763 sec/batch
Epoch 15/20  Iteration 2663/3560 Training loss: 1.5090 1.9406 sec/batch
Epoch 15/20  Iteration 2664/3560 Training loss: 1.5089 1.9468 sec/batch
Epoch 15/20  Iteration 2665/3560 Training loss: 1.5090 1.9721 sec/batch
Epoch 15/20  Iteration 2666/3560 Training loss: 1.5090 1.9229 sec/batch
Epoch 15/20  Iteration 2667/3560 Training loss: 1.5090 2.0661 sec/batch
Epoch 15/20  Iteration 2668/3560 Training loss: 1.5090 1.9751 sec/batch
Epoch 15/20  Iteration 2669/3560 Training loss: 1.5088 1.9081 sec/batch
Epoch 15/20  Iteration 2670/3560 Training loss: 1.5089 1.9958 sec/batch
Epoch 16/20  Iteration 2671/3560 Training loss: 1.5963 1.9646 sec/batch
Epoch 16/20  Iteration 2672/3560 Training loss: 1.5519 1.9464 sec/batch
Epoch 16/20  Iteration 2673/3560 Training loss: 1.5336 1.9174 sec/batch
Epoch 16/20  Iteration 2674/3560 Training loss: 1.5274 1.8922 sec/batch
Epoch 16/20  Iteration 2675/3560 Training loss: 1.5208 2.0344 sec/batch
Epoch 16/20  Iteration 2676/3560 Training loss: 1.5125 1.9328 sec/batch
Epoch 16/20  Iteration 2677/3560 Training loss: 1.5127 1.9554 sec/batch
Epoch 16/20  Iteration 2678/3560 Training loss: 1.5103 2.0185 sec/batch
Epoch 16/20  Iteration 2679/3560 Training loss: 1.5110 1.9863 sec/batch
Epoch 16/20  Iteration 2680/3560 Training loss: 1.5093 1.9308 sec/batch
Epoch 16/20  Iteration 2681/3560 Training loss: 1.5060 1.9050 sec/batch
Epoch 16/20  Iteration 2682/3560 Training loss: 1.5057 2.0984 sec/batch
Epoch 16/20  Iteration 2683/3560 Training loss: 1.5051 2.2842 sec/batch
Epoch 16/20  Iteration 2684/3560 Training loss: 1.5069 1.9340 sec/batch
Epoch 16/20  Iteration 2685/3560 Training loss: 1.5064 1.9510 sec/batch
Epoch 16/20  Iteration 2686/3560 Training loss: 1.5041 1.8718 sec/batch
Epoch 16/20  Iteration 2687/3560 Training loss: 1.5042 1.9970 sec/batch
Epoch 16/20  Iteration 2688/3560 Training loss: 1.5055 1.9362 sec/batch
Epoch 16/20  Iteration 2689/3560 Training loss: 1.5056 1.9487 sec/batch
Epoch 16/20  Iteration 2690/3560 Training loss: 1.5063 1.9191 sec/batch
Epoch 16/20  Iteration 2691/3560 Training loss: 1.5054 1.9367 sec/batch
Epoch 16/20  Iteration 2692/3560 Training loss: 1.5063 1.9766 sec/batch
Epoch 16/20  Iteration 2693/3560 Training loss: 1.5057 1.9022 sec/batch
Epoch 16/20  Iteration 2694/3560 Training loss: 1.5052 1.9217 sec/batch
Epoch 16/20  Iteration 2695/3560 Training loss: 1.5051 2.0211 sec/batch
Epoch 16/20  Iteration 2696/3560 Training loss: 1.5038 1.8767 sec/batch
Epoch 16/20  Iteration 2697/3560 Training loss: 1.5028 1.8944 sec/batch
Epoch 16/20  Iteration 2698/3560 Training loss: 1.5033 1.8761 sec/batch
Epoch 16/20  Iteration 2699/3560 Training loss: 1.5035 2.0289 sec/batch
Epoch 16/20  Iteration 2700/3560 Training loss: 1.5037 2.0339 sec/batch
Epoch 16/20  Iteration 2701/3560 Training loss: 1.5035 1.9361 sec/batch
Epoch 16/20  Iteration 2702/3560 Training loss: 1.5025 1.9107 sec/batch
Epoch 16/20  Iteration 2703/3560 Training loss: 1.5029 2.0324 sec/batch
Epoch 16/20  Iteration 2704/3560 Training loss: 1.5028 2.1498 sec/batch
Epoch 16/20  Iteration 2705/3560 Training loss: 1.5024 2.0024 sec/batch
Epoch 16/20  Iteration 2706/3560 Training loss: 1.5022 1.9350 sec/batch
Epoch 16/20  Iteration 2707/3560 Training loss: 1.5017 1.9164 sec/batch
Epoch 16/20  Iteration 2708/3560 Training loss: 1.5006 2.0312 sec/batch
Epoch 16/20  Iteration 2709/3560 Training loss: 1.4990 1.9166 sec/batch
Epoch 16/20  Iteration 2710/3560 Training loss: 1.4985 1.9511 sec/batch
Epoch 16/20  Iteration 2711/3560 Training loss: 1.4981 1.9034 sec/batch
Epoch 16/20  Iteration 2712/3560 Training loss: 1.4988 1.9191 sec/batch
Epoch 16/20  Iteration 2713/3560 Training loss: 1.4986 1.9696 sec/batch
Epoch 16/20  Iteration 2714/3560 Training loss: 1.4979 2.0529 sec/batch
Epoch 16/20  Iteration 2715/3560 Training loss: 1.4982 2.0717 sec/batch
Epoch 16/20  Iteration 2716/3560 Training loss: 1.4972 1.9248 sec/batch
Epoch 16/20  Iteration 2717/3560 Training loss: 1.4970 1.9011 sec/batch
Epoch 16/20  Iteration 2718/3560 Training loss: 1.4966 1.9183 sec/batch
Epoch 16/20  Iteration 2719/3560 Training loss: 1.4966 1.9880 sec/batch
Epoch 16/20  Iteration 2720/3560 Training loss: 1.4971 2.0480 sec/batch
Epoch 16/20  Iteration 2721/3560 Training loss: 1.4968 1.9911 sec/batch
Epoch 16/20  Iteration 2722/3560 Training loss: 1.4974 1.9166 sec/batch
Epoch 16/20  Iteration 2723/3560 Training loss: 1.4971 1.9675 sec/batch
Epoch 16/20  Iteration 2724/3560 Training loss: 1.4973 1.9808 sec/batch
Epoch 16/20  Iteration 2725/3560 Training loss: 1.4970 2.2150 sec/batch
Epoch 16/20  Iteration 2726/3560 Training loss: 1.4970 2.0238 sec/batch
Epoch 16/20  Iteration 2727/3560 Training loss: 1.4975 1.8983 sec/batch
Epoch 16/20  Iteration 2728/3560 Training loss: 1.4971 1.9211 sec/batch
Epoch 16/20  Iteration 2729/3560 Training loss: 1.4965 1.9261 sec/batch
Epoch 16/20  Iteration 2730/3560 Training loss: 1.4970 2.0128 sec/batch
Epoch 16/20  Iteration 2731/3560 Training loss: 1.4969 1.9362 sec/batch
Epoch 16/20  Iteration 2732/3560 Training loss: 1.4979 1.9529 sec/batch
Epoch 16/20  Iteration 2733/3560 Training loss: 1.4982 1.9072 sec/batch
Epoch 16/20  Iteration 2734/3560 Training loss: 1.4983 1.9454 sec/batch
Epoch 16/20  Iteration 2735/3560 Training loss: 1.4982 1.9470 sec/batch
Epoch 16/20  Iteration 2736/3560 Training loss: 1.4983 1.9123 sec/batch
Epoch 16/20  Iteration 2737/3560 Training loss: 1.4985 1.9166 sec/batch
Epoch 16/20  Iteration 2738/3560 Training loss: 1.4981 1.9270 sec/batch
Epoch 16/20  Iteration 2739/3560 Training loss: 1.4982 1.9349 sec/batch
Epoch 16/20  Iteration 2740/3560 Training loss: 1.4981 1.9093 sec/batch
Epoch 16/20  Iteration 2741/3560 Training loss: 1.4986 2.1239 sec/batch
Epoch 16/20  Iteration 2742/3560 Training loss: 1.4988 2.0393 sec/batch
Epoch 16/20  Iteration 2743/3560 Training loss: 1.4994 1.9443 sec/batch
Epoch 16/20  Iteration 2744/3560 Training loss: 1.4991 1.9729 sec/batch
Epoch 16/20  Iteration 2745/3560 Training loss: 1.4991 1.9249 sec/batch
Epoch 16/20  Iteration 2746/3560 Training loss: 1.4993 2.1783 sec/batch
Epoch 16/20  Iteration 2747/3560 Training loss: 1.4991 2.1014 sec/batch
Epoch 16/20  Iteration 2748/3560 Training loss: 1.4990 1.9460 sec/batch
Epoch 16/20  Iteration 2749/3560 Training loss: 1.4984 1.9304 sec/batch
Epoch 16/20  Iteration 2750/3560 Training loss: 1.4982 1.9557 sec/batch
Epoch 16/20  Iteration 2751/3560 Training loss: 1.4977 2.0429 sec/batch
Epoch 16/20  Iteration 2752/3560 Training loss: 1.4977 1.9322 sec/batch
Epoch 16/20  Iteration 2753/3560 Training loss: 1.4971 1.9902 sec/batch
Epoch 16/20  Iteration 2754/3560 Training loss: 1.4970 1.9083 sec/batch
Epoch 16/20  Iteration 2755/3560 Training loss: 1.4968 2.1487 sec/batch
Epoch 16/20  Iteration 2756/3560 Training loss: 1.4966 2.2723 sec/batch
Epoch 16/20  Iteration 2757/3560 Training loss: 1.4963 1.9819 sec/batch
Epoch 16/20  Iteration 2758/3560 Training loss: 1.4960 1.9138 sec/batch
Epoch 16/20  Iteration 2759/3560 Training loss: 1.4956 1.9395 sec/batch
Epoch 16/20  Iteration 2760/3560 Training loss: 1.4958 1.9268 sec/batch
Epoch 16/20  Iteration 2761/3560 Training loss: 1.4955 1.9409 sec/batch
Epoch 16/20  Iteration 2762/3560 Training loss: 1.4954 2.1644 sec/batch
Epoch 16/20  Iteration 2763/3560 Training loss: 1.4950 2.0283 sec/batch
Epoch 16/20  Iteration 2764/3560 Training loss: 1.4947 1.9606 sec/batch
Epoch 16/20  Iteration 2765/3560 Training loss: 1.4944 1.9938 sec/batch
Epoch 16/20  Iteration 2766/3560 Training loss: 1.4945 1.9371 sec/batch
Epoch 16/20  Iteration 2767/3560 Training loss: 1.4944 2.2775 sec/batch
Epoch 16/20  Iteration 2768/3560 Training loss: 1.4939 2.0600 sec/batch
Epoch 16/20  Iteration 2769/3560 Training loss: 1.4935 1.9509 sec/batch
Epoch 16/20  Iteration 2770/3560 Training loss: 1.4931 1.9504 sec/batch
Epoch 16/20  Iteration 2771/3560 Training loss: 1.4931 1.9013 sec/batch
Epoch 16/20  Iteration 2772/3560 Training loss: 1.4931 1.9836 sec/batch
Epoch 16/20  Iteration 2773/3560 Training loss: 1.4930 1.9300 sec/batch
Epoch 16/20  Iteration 2774/3560 Training loss: 1.4928 1.9189 sec/batch
Epoch 16/20  Iteration 2775/3560 Training loss: 1.4926 2.1939 sec/batch
Epoch 16/20  Iteration 2776/3560 Training loss: 1.4924 2.1017 sec/batch
Epoch 16/20  Iteration 2777/3560 Training loss: 1.4923 1.9402 sec/batch
Epoch 16/20  Iteration 2778/3560 Training loss: 1.4923 1.9461 sec/batch
Epoch 16/20  Iteration 2779/3560 Training loss: 1.4923 1.9271 sec/batch
Epoch 16/20  Iteration 2780/3560 Training loss: 1.4923 1.8688 sec/batch
Epoch 16/20  Iteration 2781/3560 Training loss: 1.4922 1.8900 sec/batch
Epoch 16/20  Iteration 2782/3560 Training loss: 1.4920 1.9358 sec/batch
Epoch 16/20  Iteration 2783/3560 Training loss: 1.4919 2.0804 sec/batch
Epoch 16/20  Iteration 2784/3560 Training loss: 1.4916 2.0170 sec/batch
Epoch 16/20  Iteration 2785/3560 Training loss: 1.4913 1.9099 sec/batch
Epoch 16/20  Iteration 2786/3560 Training loss: 1.4910 2.0572 sec/batch
Epoch 16/20  Iteration 2787/3560 Training loss: 1.4911 1.9118 sec/batch
Epoch 16/20  Iteration 2788/3560 Training loss: 1.4910 2.2324 sec/batch
Epoch 16/20  Iteration 2789/3560 Training loss: 1.4909 2.0688 sec/batch
Epoch 16/20  Iteration 2790/3560 Training loss: 1.4908 2.0896 sec/batch
Epoch 16/20  Iteration 2791/3560 Training loss: 1.4908 2.3436 sec/batch
Epoch 16/20  Iteration 2792/3560 Training loss: 1.4904 2.2649 sec/batch
Epoch 16/20  Iteration 2793/3560 Training loss: 1.4901 2.0702 sec/batch
Epoch 16/20  Iteration 2794/3560 Training loss: 1.4902 2.1811 sec/batch
Epoch 16/20  Iteration 2795/3560 Training loss: 1.4901 2.2912 sec/batch
Epoch 16/20  Iteration 2796/3560 Training loss: 1.4897 2.2723 sec/batch
Epoch 16/20  Iteration 2797/3560 Training loss: 1.4899 2.2976 sec/batch
Epoch 16/20  Iteration 2798/3560 Training loss: 1.4898 2.2442 sec/batch
Epoch 16/20  Iteration 2799/3560 Training loss: 1.4897 2.1305 sec/batch
Epoch 16/20  Iteration 2800/3560 Training loss: 1.4895 2.2822 sec/batch
Epoch 16/20  Iteration 2801/3560 Training loss: 1.4892 2.1621 sec/batch
Epoch 16/20  Iteration 2802/3560 Training loss: 1.4890 1.9747 sec/batch
Epoch 16/20  Iteration 2803/3560 Training loss: 1.4891 2.0664 sec/batch
Epoch 16/20  Iteration 2804/3560 Training loss: 1.4892 2.0179 sec/batch
Epoch 16/20  Iteration 2805/3560 Training loss: 1.4892 1.9573 sec/batch
Epoch 16/20  Iteration 2806/3560 Training loss: 1.4892 2.0182 sec/batch
Epoch 16/20  Iteration 2807/3560 Training loss: 1.4893 2.0462 sec/batch
Epoch 16/20  Iteration 2808/3560 Training loss: 1.4894 2.1421 sec/batch
Epoch 16/20  Iteration 2809/3560 Training loss: 1.4895 2.0486 sec/batch
Epoch 16/20  Iteration 2810/3560 Training loss: 1.4895 1.9274 sec/batch
Epoch 16/20  Iteration 2811/3560 Training loss: 1.4898 1.9590 sec/batch
Epoch 16/20  Iteration 2812/3560 Training loss: 1.4898 1.9289 sec/batch
Epoch 16/20  Iteration 2813/3560 Training loss: 1.4898 1.9543 sec/batch
Epoch 16/20  Iteration 2814/3560 Training loss: 1.4901 1.9864 sec/batch
Epoch 16/20  Iteration 2815/3560 Training loss: 1.4899 1.8969 sec/batch
Epoch 16/20  Iteration 2816/3560 Training loss: 1.4901 1.9358 sec/batch
Epoch 16/20  Iteration 2817/3560 Training loss: 1.4901 1.9726 sec/batch
Epoch 16/20  Iteration 2818/3560 Training loss: 1.4904 1.9025 sec/batch
Epoch 16/20  Iteration 2819/3560 Training loss: 1.4905 1.9024 sec/batch
Epoch 16/20  Iteration 2820/3560 Training loss: 1.4903 1.9236 sec/batch
Epoch 16/20  Iteration 2821/3560 Training loss: 1.4900 1.9551 sec/batch
Epoch 16/20  Iteration 2822/3560 Training loss: 1.4900 1.9395 sec/batch
Epoch 16/20  Iteration 2823/3560 Training loss: 1.4901 1.9377 sec/batch
Epoch 16/20  Iteration 2824/3560 Training loss: 1.4900 2.0635 sec/batch
Epoch 16/20  Iteration 2825/3560 Training loss: 1.4901 2.0368 sec/batch
Epoch 16/20  Iteration 2826/3560 Training loss: 1.4900 1.9206 sec/batch
Epoch 16/20  Iteration 2827/3560 Training loss: 1.4901 1.9771 sec/batch
Epoch 16/20  Iteration 2828/3560 Training loss: 1.4900 2.0275 sec/batch
Epoch 16/20  Iteration 2829/3560 Training loss: 1.4898 2.2006 sec/batch
Epoch 16/20  Iteration 2830/3560 Training loss: 1.4899 2.0198 sec/batch
Epoch 16/20  Iteration 2831/3560 Training loss: 1.4901 1.9412 sec/batch
Epoch 16/20  Iteration 2832/3560 Training loss: 1.4901 1.9724 sec/batch
Epoch 16/20  Iteration 2833/3560 Training loss: 1.4902 1.8854 sec/batch
Epoch 16/20  Iteration 2834/3560 Training loss: 1.4901 2.1383 sec/batch
Epoch 16/20  Iteration 2835/3560 Training loss: 1.4901 2.0373 sec/batch
Epoch 16/20  Iteration 2836/3560 Training loss: 1.4901 1.9903 sec/batch
Epoch 16/20  Iteration 2837/3560 Training loss: 1.4902 1.8758 sec/batch
Epoch 16/20  Iteration 2838/3560 Training loss: 1.4907 1.9802 sec/batch
Epoch 16/20  Iteration 2839/3560 Training loss: 1.4907 1.9312 sec/batch
Epoch 16/20  Iteration 2840/3560 Training loss: 1.4907 1.9630 sec/batch
Epoch 16/20  Iteration 2841/3560 Training loss: 1.4906 1.9857 sec/batch
Epoch 16/20  Iteration 2842/3560 Training loss: 1.4905 1.9267 sec/batch
Epoch 16/20  Iteration 2843/3560 Training loss: 1.4906 1.9537 sec/batch
Epoch 16/20  Iteration 2844/3560 Training loss: 1.4906 1.9273 sec/batch
Epoch 16/20  Iteration 2845/3560 Training loss: 1.4907 2.0642 sec/batch
Epoch 16/20  Iteration 2846/3560 Training loss: 1.4907 2.0270 sec/batch
Epoch 16/20  Iteration 2847/3560 Training loss: 1.4905 1.9690 sec/batch
Epoch 16/20  Iteration 2848/3560 Training loss: 1.4906 1.9602 sec/batch
Epoch 17/20  Iteration 2849/3560 Training loss: 1.5787 2.0188 sec/batch
Epoch 17/20  Iteration 2850/3560 Training loss: 1.5390 2.1770 sec/batch
Epoch 17/20  Iteration 2851/3560 Training loss: 1.5198 2.0087 sec/batch
Epoch 17/20  Iteration 2852/3560 Training loss: 1.5165 1.9418 sec/batch
Epoch 17/20  Iteration 2853/3560 Training loss: 1.5070 1.9569 sec/batch
Epoch 17/20  Iteration 2854/3560 Training loss: 1.4959 1.9026 sec/batch
Epoch 17/20  Iteration 2855/3560 Training loss: 1.4954 1.9041 sec/batch
Epoch 17/20  Iteration 2856/3560 Training loss: 1.4928 1.9858 sec/batch
Epoch 17/20  Iteration 2857/3560 Training loss: 1.4936 1.9752 sec/batch
Epoch 17/20  Iteration 2858/3560 Training loss: 1.4920 1.9260 sec/batch
Epoch 17/20  Iteration 2859/3560 Training loss: 1.4890 1.9619 sec/batch
Epoch 17/20  Iteration 2860/3560 Training loss: 1.4891 1.9185 sec/batch
Epoch 17/20  Iteration 2861/3560 Training loss: 1.4886 1.9477 sec/batch
Epoch 17/20  Iteration 2862/3560 Training loss: 1.4915 1.9351 sec/batch
Epoch 17/20  Iteration 2863/3560 Training loss: 1.4898 1.9060 sec/batch
Epoch 17/20  Iteration 2864/3560 Training loss: 1.4882 1.9593 sec/batch
Epoch 17/20  Iteration 2865/3560 Training loss: 1.4884 1.9176 sec/batch
Epoch 17/20  Iteration 2866/3560 Training loss: 1.4901 2.0977 sec/batch
Epoch 17/20  Iteration 2867/3560 Training loss: 1.4909 2.0347 sec/batch
Epoch 17/20  Iteration 2868/3560 Training loss: 1.4922 1.9660 sec/batch
Epoch 17/20  Iteration 2869/3560 Training loss: 1.4912 1.9600 sec/batch
Epoch 17/20  Iteration 2870/3560 Training loss: 1.4919 1.9469 sec/batch
Epoch 17/20  Iteration 2871/3560 Training loss: 1.4910 2.2939 sec/batch
Epoch 17/20  Iteration 2872/3560 Training loss: 1.4909 2.0285 sec/batch
Epoch 17/20  Iteration 2873/3560 Training loss: 1.4912 1.9385 sec/batch
Epoch 17/20  Iteration 2874/3560 Training loss: 1.4897 1.9565 sec/batch
Epoch 17/20  Iteration 2875/3560 Training loss: 1.4882 1.9414 sec/batch
Epoch 17/20  Iteration 2876/3560 Training loss: 1.4889 1.9166 sec/batch
Epoch 17/20  Iteration 2877/3560 Training loss: 1.4891 1.9540 sec/batch
Epoch 17/20  Iteration 2878/3560 Training loss: 1.4893 1.9764 sec/batch
Epoch 17/20  Iteration 2879/3560 Training loss: 1.4892 1.9271 sec/batch
Epoch 17/20  Iteration 2880/3560 Training loss: 1.4883 1.9425 sec/batch
Epoch 17/20  Iteration 2881/3560 Training loss: 1.4885 1.9583 sec/batch
Epoch 17/20  Iteration 2882/3560 Training loss: 1.4890 1.9330 sec/batch
Epoch 17/20  Iteration 2883/3560 Training loss: 1.4883 2.0010 sec/batch
Epoch 17/20  Iteration 2884/3560 Training loss: 1.4879 1.9098 sec/batch
Epoch 17/20  Iteration 2885/3560 Training loss: 1.4869 1.9582 sec/batch
Epoch 17/20  Iteration 2886/3560 Training loss: 1.4858 1.9585 sec/batch
Epoch 17/20  Iteration 2887/3560 Training loss: 1.4846 2.0576 sec/batch
Epoch 17/20  Iteration 2888/3560 Training loss: 1.4841 2.0772 sec/batch
Epoch 17/20  Iteration 2889/3560 Training loss: 1.4838 2.0558 sec/batch
Epoch 17/20  Iteration 2890/3560 Training loss: 1.4842 2.1070 sec/batch
Epoch 17/20  Iteration 2891/3560 Training loss: 1.4838 1.9771 sec/batch
Epoch 17/20  Iteration 2892/3560 Training loss: 1.4831 2.1894 sec/batch
Epoch 17/20  Iteration 2893/3560 Training loss: 1.4834 2.1404 sec/batch
Epoch 17/20  Iteration 2894/3560 Training loss: 1.4826 2.1117 sec/batch
Epoch 17/20  Iteration 2895/3560 Training loss: 1.4823 1.9729 sec/batch
Epoch 17/20  Iteration 2896/3560 Training loss: 1.4819 2.0071 sec/batch
Epoch 17/20  Iteration 2897/3560 Training loss: 1.4818 1.9317 sec/batch
Epoch 17/20  Iteration 2898/3560 Training loss: 1.4822 1.9853 sec/batch
Epoch 17/20  Iteration 2899/3560 Training loss: 1.4818 1.9383 sec/batch
Epoch 17/20  Iteration 2900/3560 Training loss: 1.4826 1.8977 sec/batch
Epoch 17/20  Iteration 2901/3560 Training loss: 1.4824 1.9195 sec/batch
Epoch 17/20  Iteration 2902/3560 Training loss: 1.4824 1.9886 sec/batch
Epoch 17/20  Iteration 2903/3560 Training loss: 1.4822 1.9435 sec/batch
Epoch 17/20  Iteration 2904/3560 Training loss: 1.4823 2.2176 sec/batch
Epoch 17/20  Iteration 2905/3560 Training loss: 1.4829 2.1061 sec/batch
Epoch 17/20  Iteration 2906/3560 Training loss: 1.4825 1.9936 sec/batch
Epoch 17/20  Iteration 2907/3560 Training loss: 1.4820 1.9081 sec/batch
Epoch 17/20  Iteration 2908/3560 Training loss: 1.4825 2.0264 sec/batch
Epoch 17/20  Iteration 2909/3560 Training loss: 1.4823 2.0369 sec/batch
Epoch 17/20  Iteration 2910/3560 Training loss: 1.4834 1.9438 sec/batch
Epoch 17/20  Iteration 2911/3560 Training loss: 1.4837 1.9509 sec/batch
Epoch 17/20  Iteration 2912/3560 Training loss: 1.4839 1.9431 sec/batch
Epoch 17/20  Iteration 2913/3560 Training loss: 1.4841 2.2633 sec/batch
Epoch 17/20  Iteration 2914/3560 Training loss: 1.4842 2.0313 sec/batch
Epoch 17/20  Iteration 2915/3560 Training loss: 1.4842 1.9166 sec/batch
Epoch 17/20  Iteration 2916/3560 Training loss: 1.4838 1.9664 sec/batch
Epoch 17/20  Iteration 2917/3560 Training loss: 1.4839 1.9338 sec/batch
Epoch 17/20  Iteration 2918/3560 Training loss: 1.4837 1.9683 sec/batch
Epoch 17/20  Iteration 2919/3560 Training loss: 1.4841 2.0360 sec/batch
Epoch 17/20  Iteration 2920/3560 Training loss: 1.4843 2.0090 sec/batch
Epoch 17/20  Iteration 2921/3560 Training loss: 1.4849 2.1450 sec/batch
Epoch 17/20  Iteration 2922/3560 Training loss: 1.4846 2.4157 sec/batch
Epoch 17/20  Iteration 2923/3560 Training loss: 1.4845 2.3357 sec/batch
Epoch 17/20  Iteration 2924/3560 Training loss: 1.4848 1.9960 sec/batch
Epoch 17/20  Iteration 2925/3560 Training loss: 1.4848 2.1163 sec/batch
Epoch 17/20  Iteration 2926/3560 Training loss: 1.4845 2.8128 sec/batch
Epoch 17/20  Iteration 2927/3560 Training loss: 1.4839 2.4304 sec/batch
Epoch 17/20  Iteration 2928/3560 Training loss: 1.4838 2.4542 sec/batch
Epoch 17/20  Iteration 2929/3560 Training loss: 1.4834 2.6098 sec/batch
Epoch 17/20  Iteration 2930/3560 Training loss: 1.4834 2.0787 sec/batch
Epoch 17/20  Iteration 2931/3560 Training loss: 1.4829 1.9826 sec/batch
Epoch 17/20  Iteration 2932/3560 Training loss: 1.4828 2.0892 sec/batch
Epoch 17/20  Iteration 2933/3560 Training loss: 1.4826 2.1566 sec/batch
Epoch 17/20  Iteration 2934/3560 Training loss: 1.4824 1.9547 sec/batch
Epoch 17/20  Iteration 2935/3560 Training loss: 1.4821 1.9538 sec/batch
Epoch 17/20  Iteration 2936/3560 Training loss: 1.4820 2.0475 sec/batch
Epoch 17/20  Iteration 2937/3560 Training loss: 1.4814 1.9351 sec/batch
Epoch 17/20  Iteration 2938/3560 Training loss: 1.4815 1.9823 sec/batch
Epoch 17/20  Iteration 2939/3560 Training loss: 1.4812 1.9638 sec/batch
Epoch 17/20  Iteration 2940/3560 Training loss: 1.4810 1.9301 sec/batch
Epoch 17/20  Iteration 2941/3560 Training loss: 1.4807 1.9319 sec/batch
Epoch 17/20  Iteration 2942/3560 Training loss: 1.4805 1.9396 sec/batch
Epoch 17/20  Iteration 2943/3560 Training loss: 1.4802 1.9371 sec/batch
Epoch 17/20  Iteration 2944/3560 Training loss: 1.4803 1.9198 sec/batch
Epoch 17/20  Iteration 2945/3560 Training loss: 1.4802 2.0061 sec/batch
Epoch 17/20  Iteration 2946/3560 Training loss: 1.4799 1.9523 sec/batch
Epoch 17/20  Iteration 2947/3560 Training loss: 1.4796 1.9414 sec/batch
Epoch 17/20  Iteration 2948/3560 Training loss: 1.4791 1.9550 sec/batch
Epoch 17/20  Iteration 2949/3560 Training loss: 1.4791 2.0469 sec/batch
Epoch 17/20  Iteration 2950/3560 Training loss: 1.4790 1.9566 sec/batch
Epoch 17/20  Iteration 2951/3560 Training loss: 1.4788 1.9708 sec/batch
Epoch 17/20  Iteration 2952/3560 Training loss: 1.4787 2.0508 sec/batch
Epoch 17/20  Iteration 2953/3560 Training loss: 1.4785 2.4143 sec/batch
Epoch 17/20  Iteration 2954/3560 Training loss: 1.4784 2.2044 sec/batch
Epoch 17/20  Iteration 2955/3560 Training loss: 1.4783 1.8987 sec/batch
Epoch 17/20  Iteration 2956/3560 Training loss: 1.4783 1.9457 sec/batch
Epoch 17/20  Iteration 2957/3560 Training loss: 1.4782 1.9231 sec/batch
Epoch 17/20  Iteration 2958/3560 Training loss: 1.4783 1.9330 sec/batch
Epoch 17/20  Iteration 2959/3560 Training loss: 1.4781 1.9018 sec/batch
Epoch 17/20  Iteration 2960/3560 Training loss: 1.4780 1.9907 sec/batch
Epoch 17/20  Iteration 2961/3560 Training loss: 1.4780 1.9871 sec/batch
Epoch 17/20  Iteration 2962/3560 Training loss: 1.4777 1.9334 sec/batch
Epoch 17/20  Iteration 2963/3560 Training loss: 1.4775 1.9124 sec/batch
Epoch 17/20  Iteration 2964/3560 Training loss: 1.4772 1.9555 sec/batch
Epoch 17/20  Iteration 2965/3560 Training loss: 1.4772 1.9108 sec/batch
Epoch 17/20  Iteration 2966/3560 Training loss: 1.4772 1.9382 sec/batch
Epoch 17/20  Iteration 2967/3560 Training loss: 1.4771 1.9229 sec/batch
Epoch 17/20  Iteration 2968/3560 Training loss: 1.4770 1.8696 sec/batch
Epoch 17/20  Iteration 2969/3560 Training loss: 1.4769 1.9287 sec/batch
Epoch 17/20  Iteration 2970/3560 Training loss: 1.4766 2.0320 sec/batch
Epoch 17/20  Iteration 2971/3560 Training loss: 1.4762 2.1227 sec/batch
Epoch 17/20  Iteration 2972/3560 Training loss: 1.4762 1.9809 sec/batch
Epoch 17/20  Iteration 2973/3560 Training loss: 1.4761 1.9369 sec/batch
Epoch 17/20  Iteration 2974/3560 Training loss: 1.4758 2.0709 sec/batch
Epoch 17/20  Iteration 2975/3560 Training loss: 1.4758 2.1103 sec/batch
Epoch 17/20  Iteration 2976/3560 Training loss: 1.4759 2.0373 sec/batch
Epoch 17/20  Iteration 2977/3560 Training loss: 1.4757 1.9515 sec/batch
Epoch 17/20  Iteration 2978/3560 Training loss: 1.4754 2.0113 sec/batch
Epoch 17/20  Iteration 2979/3560 Training loss: 1.4751 1.9073 sec/batch
Epoch 17/20  Iteration 2980/3560 Training loss: 1.4750 1.8765 sec/batch
Epoch 17/20  Iteration 2981/3560 Training loss: 1.4750 1.9816 sec/batch
Epoch 17/20  Iteration 2982/3560 Training loss: 1.4751 1.9021 sec/batch
Epoch 17/20  Iteration 2983/3560 Training loss: 1.4751 1.9341 sec/batch
Epoch 17/20  Iteration 2984/3560 Training loss: 1.4751 1.9666 sec/batch
Epoch 17/20  Iteration 2985/3560 Training loss: 1.4753 2.0836 sec/batch
Epoch 17/20  Iteration 2986/3560 Training loss: 1.4754 1.9241 sec/batch
Epoch 17/20  Iteration 2987/3560 Training loss: 1.4755 1.9533 sec/batch
Epoch 17/20  Iteration 2988/3560 Training loss: 1.4754 1.9107 sec/batch
Epoch 17/20  Iteration 2989/3560 Training loss: 1.4758 1.9235 sec/batch
Epoch 17/20  Iteration 2990/3560 Training loss: 1.4757 1.9715 sec/batch
Epoch 17/20  Iteration 2991/3560 Training loss: 1.4756 2.0661 sec/batch
Epoch 17/20  Iteration 2992/3560 Training loss: 1.4758 2.0884 sec/batch
Epoch 17/20  Iteration 2993/3560 Training loss: 1.4757 1.9705 sec/batch
Epoch 17/20  Iteration 2994/3560 Training loss: 1.4758 1.9299 sec/batch
Epoch 17/20  Iteration 2995/3560 Training loss: 1.4759 2.0241 sec/batch
Epoch 17/20  Iteration 2996/3560 Training loss: 1.4760 2.1420 sec/batch
Epoch 17/20  Iteration 2997/3560 Training loss: 1.4762 2.0819 sec/batch
Epoch 17/20  Iteration 2998/3560 Training loss: 1.4760 1.9329 sec/batch
Epoch 17/20  Iteration 2999/3560 Training loss: 1.4757 1.8975 sec/batch
Epoch 17/20  Iteration 3000/3560 Training loss: 1.4757 1.8993 sec/batch
Epoch 17/20  Iteration 3001/3560 Training loss: 1.4757 1.9322 sec/batch
Epoch 17/20  Iteration 3002/3560 Training loss: 1.4757 2.0820 sec/batch
Epoch 17/20  Iteration 3003/3560 Training loss: 1.4757 1.9492 sec/batch
Epoch 17/20  Iteration 3004/3560 Training loss: 1.4757 1.9209 sec/batch
Epoch 17/20  Iteration 3005/3560 Training loss: 1.4758 1.9406 sec/batch
Epoch 17/20  Iteration 3006/3560 Training loss: 1.4757 1.9505 sec/batch
Epoch 17/20  Iteration 3007/3560 Training loss: 1.4755 1.9472 sec/batch
Epoch 17/20  Iteration 3008/3560 Training loss: 1.4756 1.8882 sec/batch
Epoch 17/20  Iteration 3009/3560 Training loss: 1.4758 1.9059 sec/batch
Epoch 17/20  Iteration 3010/3560 Training loss: 1.4758 1.9120 sec/batch
Epoch 17/20  Iteration 3011/3560 Training loss: 1.4759 1.8924 sec/batch
Epoch 17/20  Iteration 3012/3560 Training loss: 1.4758 2.2453 sec/batch
Epoch 17/20  Iteration 3013/3560 Training loss: 1.4758 2.1210 sec/batch
Epoch 17/20  Iteration 3014/3560 Training loss: 1.4758 2.0020 sec/batch
Epoch 17/20  Iteration 3015/3560 Training loss: 1.4759 1.9624 sec/batch
Epoch 17/20  Iteration 3016/3560 Training loss: 1.4764 2.0110 sec/batch
Epoch 17/20  Iteration 3017/3560 Training loss: 1.4763 2.1335 sec/batch
Epoch 17/20  Iteration 3018/3560 Training loss: 1.4764 1.9856 sec/batch
Epoch 17/20  Iteration 3019/3560 Training loss: 1.4762 1.9457 sec/batch
Epoch 17/20  Iteration 3020/3560 Training loss: 1.4760 1.9581 sec/batch
Epoch 17/20  Iteration 3021/3560 Training loss: 1.4762 1.9474 sec/batch
Epoch 17/20  Iteration 3022/3560 Training loss: 1.4762 1.9455 sec/batch
Epoch 17/20  Iteration 3023/3560 Training loss: 1.4762 2.0794 sec/batch
Epoch 17/20  Iteration 3024/3560 Training loss: 1.4761 1.9541 sec/batch
Epoch 17/20  Iteration 3025/3560 Training loss: 1.4759 1.9437 sec/batch
Epoch 17/20  Iteration 3026/3560 Training loss: 1.4760 1.9411 sec/batch
Epoch 18/20  Iteration 3027/3560 Training loss: 1.5597 1.9595 sec/batch
Epoch 18/20  Iteration 3028/3560 Training loss: 1.5212 1.9637 sec/batch
Epoch 18/20  Iteration 3029/3560 Training loss: 1.5058 1.9249 sec/batch
Epoch 18/20  Iteration 3030/3560 Training loss: 1.4984 1.9399 sec/batch
Epoch 18/20  Iteration 3031/3560 Training loss: 1.4916 1.9206 sec/batch
Epoch 18/20  Iteration 3032/3560 Training loss: 1.4828 1.9334 sec/batch
Epoch 18/20  Iteration 3033/3560 Training loss: 1.4836 2.0421 sec/batch
Epoch 18/20  Iteration 3034/3560 Training loss: 1.4807 2.0020 sec/batch
Epoch 18/20  Iteration 3035/3560 Training loss: 1.4818 2.0822 sec/batch
Epoch 18/20  Iteration 3036/3560 Training loss: 1.4801 1.9467 sec/batch
Epoch 18/20  Iteration 3037/3560 Training loss: 1.4770 1.9438 sec/batch
Epoch 18/20  Iteration 3038/3560 Training loss: 1.4760 2.1999 sec/batch
Epoch 18/20  Iteration 3039/3560 Training loss: 1.4761 2.0430 sec/batch
Epoch 18/20  Iteration 3040/3560 Training loss: 1.4776 1.9048 sec/batch
Epoch 18/20  Iteration 3041/3560 Training loss: 1.4762 2.0083 sec/batch
Epoch 18/20  Iteration 3042/3560 Training loss: 1.4746 1.9068 sec/batch
Epoch 18/20  Iteration 3043/3560 Training loss: 1.4750 1.9619 sec/batch
Epoch 18/20  Iteration 3044/3560 Training loss: 1.4767 1.9645 sec/batch
Epoch 18/20  Iteration 3045/3560 Training loss: 1.4768 1.9678 sec/batch
Epoch 18/20  Iteration 3046/3560 Training loss: 1.4777 1.9764 sec/batch
Epoch 18/20  Iteration 3047/3560 Training loss: 1.4768 1.9199 sec/batch
Epoch 18/20  Iteration 3048/3560 Training loss: 1.4771 1.9457 sec/batch
Epoch 18/20  Iteration 3049/3560 Training loss: 1.4764 2.0595 sec/batch
Epoch 18/20  Iteration 3050/3560 Training loss: 1.4758 1.8936 sec/batch
Epoch 18/20  Iteration 3051/3560 Training loss: 1.4755 1.9287 sec/batch
Epoch 18/20  Iteration 3052/3560 Training loss: 1.4736 1.9077 sec/batch
Epoch 18/20  Iteration 3053/3560 Training loss: 1.4723 2.1756 sec/batch
Epoch 18/20  Iteration 3054/3560 Training loss: 1.4730 2.0445 sec/batch
Epoch 18/20  Iteration 3055/3560 Training loss: 1.4734 2.0127 sec/batch
Epoch 18/20  Iteration 3056/3560 Training loss: 1.4734 1.9994 sec/batch
Epoch 18/20  Iteration 3057/3560 Training loss: 1.4734 1.9696 sec/batch
Epoch 18/20  Iteration 3058/3560 Training loss: 1.4726 1.9741 sec/batch
Epoch 18/20  Iteration 3059/3560 Training loss: 1.4727 2.1937 sec/batch
Epoch 18/20  Iteration 3060/3560 Training loss: 1.4728 2.0335 sec/batch
Epoch 18/20  Iteration 3061/3560 Training loss: 1.4724 1.9760 sec/batch
Epoch 18/20  Iteration 3062/3560 Training loss: 1.4720 1.9341 sec/batch
Epoch 18/20  Iteration 3063/3560 Training loss: 1.4713 1.9483 sec/batch
Epoch 18/20  Iteration 3064/3560 Training loss: 1.4701 1.9401 sec/batch
Epoch 18/20  Iteration 3065/3560 Training loss: 1.4688 2.0094 sec/batch
Epoch 18/20  Iteration 3066/3560 Training loss: 1.4683 1.9327 sec/batch
Epoch 18/20  Iteration 3067/3560 Training loss: 1.4678 1.9074 sec/batch
Epoch 18/20  Iteration 3068/3560 Training loss: 1.4684 2.0034 sec/batch
Epoch 18/20  Iteration 3069/3560 Training loss: 1.4678 1.9612 sec/batch
Epoch 18/20  Iteration 3070/3560 Training loss: 1.4671 1.9597 sec/batch
Epoch 18/20  Iteration 3071/3560 Training loss: 1.4675 1.8939 sec/batch
Epoch 18/20  Iteration 3072/3560 Training loss: 1.4666 1.9186 sec/batch
Epoch 18/20  Iteration 3073/3560 Training loss: 1.4664 2.0579 sec/batch
Epoch 18/20  Iteration 3074/3560 Training loss: 1.4661 2.0664 sec/batch
Epoch 18/20  Iteration 3075/3560 Training loss: 1.4659 2.0772 sec/batch
Epoch 18/20  Iteration 3076/3560 Training loss: 1.4665 2.0025 sec/batch
Epoch 18/20  Iteration 3077/3560 Training loss: 1.4659 1.9577 sec/batch
Epoch 18/20  Iteration 3078/3560 Training loss: 1.4665 1.9520 sec/batch
Epoch 18/20  Iteration 3079/3560 Training loss: 1.4666 1.8860 sec/batch
Epoch 18/20  Iteration 3080/3560 Training loss: 1.4669 2.2459 sec/batch
Epoch 18/20  Iteration 3081/3560 Training loss: 1.4666 2.1125 sec/batch
Epoch 18/20  Iteration 3082/3560 Training loss: 1.4668 1.9290 sec/batch
Epoch 18/20  Iteration 3083/3560 Training loss: 1.4672 1.9179 sec/batch
Epoch 18/20  Iteration 3084/3560 Training loss: 1.4669 2.0137 sec/batch
Epoch 18/20  Iteration 3085/3560 Training loss: 1.4664 1.9349 sec/batch
Epoch 18/20  Iteration 3086/3560 Training loss: 1.4669 1.9474 sec/batch
Epoch 18/20  Iteration 3087/3560 Training loss: 1.4668 1.9676 sec/batch
Epoch 18/20  Iteration 3088/3560 Training loss: 1.4678 1.8893 sec/batch
Epoch 18/20  Iteration 3089/3560 Training loss: 1.4684 1.9100 sec/batch
Epoch 18/20  Iteration 3090/3560 Training loss: 1.4686 1.9087 sec/batch
Epoch 18/20  Iteration 3091/3560 Training loss: 1.4685 1.9167 sec/batch
Epoch 18/20  Iteration 3092/3560 Training loss: 1.4689 1.9342 sec/batch
Epoch 18/20  Iteration 3093/3560 Training loss: 1.4691 1.8845 sec/batch
Epoch 18/20  Iteration 3094/3560 Training loss: 1.4688 1.9267 sec/batch
Epoch 18/20  Iteration 3095/3560 Training loss: 1.4690 2.0285 sec/batch
Epoch 18/20  Iteration 3096/3560 Training loss: 1.4690 1.9723 sec/batch
Epoch 18/20  Iteration 3097/3560 Training loss: 1.4696 2.0501 sec/batch
Epoch 18/20  Iteration 3098/3560 Training loss: 1.4699 1.9435 sec/batch
Epoch 18/20  Iteration 3099/3560 Training loss: 1.4703 1.9795 sec/batch
Epoch 18/20  Iteration 3100/3560 Training loss: 1.4702 1.9595 sec/batch
Epoch 18/20  Iteration 3101/3560 Training loss: 1.4702 2.0210 sec/batch
Epoch 18/20  Iteration 3102/3560 Training loss: 1.4704 2.2055 sec/batch
Epoch 18/20  Iteration 3103/3560 Training loss: 1.4702 1.9509 sec/batch
Epoch 18/20  Iteration 3104/3560 Training loss: 1.4702 1.9153 sec/batch
Epoch 18/20  Iteration 3105/3560 Training loss: 1.4696 1.9917 sec/batch
Epoch 18/20  Iteration 3106/3560 Training loss: 1.4694 1.9427 sec/batch
Epoch 18/20  Iteration 3107/3560 Training loss: 1.4690 1.9170 sec/batch
Epoch 18/20  Iteration 3108/3560 Training loss: 1.4689 2.1694 sec/batch
Epoch 18/20  Iteration 3109/3560 Training loss: 1.4683 1.9323 sec/batch
Epoch 18/20  Iteration 3110/3560 Training loss: 1.4682 1.8872 sec/batch
Epoch 18/20  Iteration 3111/3560 Training loss: 1.4680 1.9676 sec/batch
Epoch 18/20  Iteration 3112/3560 Training loss: 1.4678 1.9511 sec/batch
Epoch 18/20  Iteration 3113/3560 Training loss: 1.4675 1.9443 sec/batch
Epoch 18/20  Iteration 3114/3560 Training loss: 1.4673 1.9210 sec/batch
Epoch 18/20  Iteration 3115/3560 Training loss: 1.4669 1.9029 sec/batch
Epoch 18/20  Iteration 3116/3560 Training loss: 1.4670 1.9428 sec/batch
Epoch 18/20  Iteration 3117/3560 Training loss: 1.4667 1.9191 sec/batch
Epoch 18/20  Iteration 3118/3560 Training loss: 1.4665 2.0475 sec/batch
Epoch 18/20  Iteration 3119/3560 Training loss: 1.4662 2.1398 sec/batch
Epoch 18/20  Iteration 3120/3560 Training loss: 1.4659 1.9423 sec/batch
Epoch 18/20  Iteration 3121/3560 Training loss: 1.4657 1.9175 sec/batch
Epoch 18/20  Iteration 3122/3560 Training loss: 1.4657 2.0347 sec/batch
Epoch 18/20  Iteration 3123/3560 Training loss: 1.4657 2.1369 sec/batch
Epoch 18/20  Iteration 3124/3560 Training loss: 1.4653 1.9925 sec/batch
Epoch 18/20  Iteration 3125/3560 Training loss: 1.4650 1.9449 sec/batch
Epoch 18/20  Iteration 3126/3560 Training loss: 1.4645 1.9203 sec/batch
Epoch 18/20  Iteration 3127/3560 Training loss: 1.4646 1.9228 sec/batch
Epoch 18/20  Iteration 3128/3560 Training loss: 1.4644 1.9362 sec/batch
Epoch 18/20  Iteration 3129/3560 Training loss: 1.4643 1.9565 sec/batch
Epoch 18/20  Iteration 3130/3560 Training loss: 1.4643 1.9491 sec/batch
Epoch 18/20  Iteration 3131/3560 Training loss: 1.4641 1.9145 sec/batch
Epoch 18/20  Iteration 3132/3560 Training loss: 1.4639 1.9528 sec/batch
Epoch 18/20  Iteration 3133/3560 Training loss: 1.4638 1.9783 sec/batch
Epoch 18/20  Iteration 3134/3560 Training loss: 1.4638 2.1206 sec/batch
Epoch 18/20  Iteration 3135/3560 Training loss: 1.4637 1.9657 sec/batch
Epoch 18/20  Iteration 3136/3560 Training loss: 1.4638 1.8981 sec/batch
Epoch 18/20  Iteration 3137/3560 Training loss: 1.4637 1.9413 sec/batch
Epoch 18/20  Iteration 3138/3560 Training loss: 1.4636 1.9085 sec/batch
Epoch 18/20  Iteration 3139/3560 Training loss: 1.4635 2.0348 sec/batch
Epoch 18/20  Iteration 3140/3560 Training loss: 1.4633 1.9922 sec/batch
Epoch 18/20  Iteration 3141/3560 Training loss: 1.4630 1.9417 sec/batch
Epoch 18/20  Iteration 3142/3560 Training loss: 1.4628 1.8881 sec/batch
Epoch 18/20  Iteration 3143/3560 Training loss: 1.4628 1.9947 sec/batch
Epoch 18/20  Iteration 3144/3560 Training loss: 1.4628 2.1759 sec/batch
Epoch 18/20  Iteration 3145/3560 Training loss: 1.4627 2.0035 sec/batch
Epoch 18/20  Iteration 3146/3560 Training loss: 1.4626 1.9435 sec/batch
Epoch 18/20  Iteration 3147/3560 Training loss: 1.4625 1.9207 sec/batch
Epoch 18/20  Iteration 3148/3560 Training loss: 1.4621 1.9099 sec/batch
Epoch 18/20  Iteration 3149/3560 Training loss: 1.4617 1.9258 sec/batch
Epoch 18/20  Iteration 3150/3560 Training loss: 1.4617 1.9359 sec/batch
Epoch 18/20  Iteration 3151/3560 Training loss: 1.4617 1.9297 sec/batch
Epoch 18/20  Iteration 3152/3560 Training loss: 1.4614 1.9884 sec/batch
Epoch 18/20  Iteration 3153/3560 Training loss: 1.4614 1.9405 sec/batch
Epoch 18/20  Iteration 3154/3560 Training loss: 1.4615 1.8973 sec/batch
Epoch 18/20  Iteration 3155/3560 Training loss: 1.4613 2.0059 sec/batch
Epoch 18/20  Iteration 3156/3560 Training loss: 1.4611 1.9264 sec/batch
Epoch 18/20  Iteration 3157/3560 Training loss: 1.4607 1.9490 sec/batch
Epoch 18/20  Iteration 3158/3560 Training loss: 1.4604 1.8937 sec/batch
Epoch 18/20  Iteration 3159/3560 Training loss: 1.4605 1.8903 sec/batch
Epoch 18/20  Iteration 3160/3560 Training loss: 1.4605 2.0195 sec/batch
Epoch 18/20  Iteration 3161/3560 Training loss: 1.4606 2.0784 sec/batch
Epoch 18/20  Iteration 3162/3560 Training loss: 1.4606 1.9666 sec/batch
Epoch 18/20  Iteration 3163/3560 Training loss: 1.4608 1.9305 sec/batch
Epoch 18/20  Iteration 3164/3560 Training loss: 1.4609 1.9592 sec/batch
Epoch 18/20  Iteration 3165/3560 Training loss: 1.4609 2.1740 sec/batch
Epoch 18/20  Iteration 3166/3560 Training loss: 1.4609 2.1754 sec/batch
Epoch 18/20  Iteration 3167/3560 Training loss: 1.4612 1.9407 sec/batch
Epoch 18/20  Iteration 3168/3560 Training loss: 1.4612 1.9352 sec/batch
Epoch 18/20  Iteration 3169/3560 Training loss: 1.4611 1.9336 sec/batch
Epoch 18/20  Iteration 3170/3560 Training loss: 1.4613 1.9528 sec/batch
Epoch 18/20  Iteration 3171/3560 Training loss: 1.4612 1.9510 sec/batch
Epoch 18/20  Iteration 3172/3560 Training loss: 1.4614 1.9696 sec/batch
Epoch 18/20  Iteration 3173/3560 Training loss: 1.4614 1.9057 sec/batch
Epoch 18/20  Iteration 3174/3560 Training loss: 1.4617 1.9135 sec/batch
Epoch 18/20  Iteration 3175/3560 Training loss: 1.4618 1.9567 sec/batch
Epoch 18/20  Iteration 3176/3560 Training loss: 1.4617 1.8929 sec/batch
Epoch 18/20  Iteration 3177/3560 Training loss: 1.4614 1.9152 sec/batch
Epoch 18/20  Iteration 3178/3560 Training loss: 1.4614 1.9309 sec/batch
Epoch 18/20  Iteration 3179/3560 Training loss: 1.4614 1.9148 sec/batch
Epoch 18/20  Iteration 3180/3560 Training loss: 1.4614 1.8903 sec/batch
Epoch 18/20  Iteration 3181/3560 Training loss: 1.4614 1.9131 sec/batch
Epoch 18/20  Iteration 3182/3560 Training loss: 1.4614 2.0250 sec/batch
Epoch 18/20  Iteration 3183/3560 Training loss: 1.4615 2.0281 sec/batch
Epoch 18/20  Iteration 3184/3560 Training loss: 1.4614 1.9591 sec/batch
Epoch 18/20  Iteration 3185/3560 Training loss: 1.4611 1.9435 sec/batch
Epoch 18/20  Iteration 3186/3560 Training loss: 1.4613 2.0291 sec/batch
Epoch 18/20  Iteration 3187/3560 Training loss: 1.4615 2.1661 sec/batch
Epoch 18/20  Iteration 3188/3560 Training loss: 1.4615 1.9947 sec/batch
Epoch 18/20  Iteration 3189/3560 Training loss: 1.4616 1.9533 sec/batch
Epoch 18/20  Iteration 3190/3560 Training loss: 1.4616 1.9243 sec/batch
Epoch 18/20  Iteration 3191/3560 Training loss: 1.4616 1.8995 sec/batch
Epoch 18/20  Iteration 3192/3560 Training loss: 1.4616 1.9431 sec/batch
Epoch 18/20  Iteration 3193/3560 Training loss: 1.4618 1.9743 sec/batch
Epoch 18/20  Iteration 3194/3560 Training loss: 1.4622 2.0191 sec/batch
Epoch 18/20  Iteration 3195/3560 Training loss: 1.4622 2.0605 sec/batch
Epoch 18/20  Iteration 3196/3560 Training loss: 1.4622 2.0427 sec/batch
Epoch 18/20  Iteration 3197/3560 Training loss: 1.4621 1.9355 sec/batch
Epoch 18/20  Iteration 3198/3560 Training loss: 1.4619 1.9347 sec/batch
Epoch 18/20  Iteration 3199/3560 Training loss: 1.4621 1.9340 sec/batch
Epoch 18/20  Iteration 3200/3560 Training loss: 1.4621 1.9164 sec/batch
Epoch 18/20  Iteration 3201/3560 Training loss: 1.4622 1.9118 sec/batch
Epoch 18/20  Iteration 3202/3560 Training loss: 1.4621 1.9210 sec/batch
Epoch 18/20  Iteration 3203/3560 Training loss: 1.4620 2.0113 sec/batch
Epoch 18/20  Iteration 3204/3560 Training loss: 1.4621 2.1252 sec/batch
Epoch 19/20  Iteration 3205/3560 Training loss: 1.5404 2.3527 sec/batch
Epoch 19/20  Iteration 3206/3560 Training loss: 1.5067 1.9246 sec/batch
Epoch 19/20  Iteration 3207/3560 Training loss: 1.4894 2.0521 sec/batch
Epoch 19/20  Iteration 3208/3560 Training loss: 1.4823 2.2737 sec/batch
Epoch 19/20  Iteration 3209/3560 Training loss: 1.4769 1.9949 sec/batch
Epoch 19/20  Iteration 3210/3560 Training loss: 1.4676 1.9378 sec/batch
Epoch 19/20  Iteration 3211/3560 Training loss: 1.4681 1.8683 sec/batch
Epoch 19/20  Iteration 3212/3560 Training loss: 1.4653 1.9897 sec/batch
Epoch 19/20  Iteration 3213/3560 Training loss: 1.4653 1.9905 sec/batch
Epoch 19/20  Iteration 3214/3560 Training loss: 1.4635 1.9081 sec/batch
Epoch 19/20  Iteration 3215/3560 Training loss: 1.4596 1.9181 sec/batch
Epoch 19/20  Iteration 3216/3560 Training loss: 1.4600 1.9190 sec/batch
Epoch 19/20  Iteration 3217/3560 Training loss: 1.4603 1.9704 sec/batch
Epoch 19/20  Iteration 3218/3560 Training loss: 1.4619 1.9820 sec/batch
Epoch 19/20  Iteration 3219/3560 Training loss: 1.4607 1.9343 sec/batch
Epoch 19/20  Iteration 3220/3560 Training loss: 1.4588 1.9125 sec/batch
Epoch 19/20  Iteration 3221/3560 Training loss: 1.4598 1.9337 sec/batch
Epoch 19/20  Iteration 3222/3560 Training loss: 1.4610 1.9253 sec/batch
Epoch 19/20  Iteration 3223/3560 Training loss: 1.4615 1.9244 sec/batch
Epoch 19/20  Iteration 3224/3560 Training loss: 1.4624 2.0383 sec/batch
Epoch 19/20  Iteration 3225/3560 Training loss: 1.4618 2.0087 sec/batch
Epoch 19/20  Iteration 3226/3560 Training loss: 1.4627 1.9241 sec/batch
Epoch 19/20  Iteration 3227/3560 Training loss: 1.4618 1.9589 sec/batch
Epoch 19/20  Iteration 3228/3560 Training loss: 1.4618 2.0262 sec/batch
Epoch 19/20  Iteration 3229/3560 Training loss: 1.4619 2.1144 sec/batch
Epoch 19/20  Iteration 3230/3560 Training loss: 1.4608 2.4109 sec/batch
Epoch 19/20  Iteration 3231/3560 Training loss: 1.4596 2.3439 sec/batch
Epoch 19/20  Iteration 3232/3560 Training loss: 1.4600 1.9145 sec/batch
Epoch 19/20  Iteration 3233/3560 Training loss: 1.4604 1.8089 sec/batch
Epoch 19/20  Iteration 3234/3560 Training loss: 1.4608 1.8345 sec/batch
Epoch 19/20  Iteration 3235/3560 Training loss: 1.4607 1.9538 sec/batch
Epoch 19/20  Iteration 3236/3560 Training loss: 1.4599 1.8652 sec/batch
Epoch 19/20  Iteration 3237/3560 Training loss: 1.4598 1.8936 sec/batch
Epoch 19/20  Iteration 3238/3560 Training loss: 1.4598 1.8226 sec/batch
Epoch 19/20  Iteration 3239/3560 Training loss: 1.4595 1.8656 sec/batch
Epoch 19/20  Iteration 3240/3560 Training loss: 1.4594 2.4765 sec/batch
Epoch 19/20  Iteration 3241/3560 Training loss: 1.4585 2.7840 sec/batch
Epoch 19/20  Iteration 3242/3560 Training loss: 1.4572 2.0827 sec/batch
Epoch 19/20  Iteration 3243/3560 Training loss: 1.4558 2.1442 sec/batch
Epoch 19/20  Iteration 3244/3560 Training loss: 1.4552 2.3175 sec/batch
Epoch 19/20  Iteration 3245/3560 Training loss: 1.4547 2.4499 sec/batch
Epoch 19/20  Iteration 3246/3560 Training loss: 1.4554 2.7372 sec/batch
Epoch 19/20  Iteration 3247/3560 Training loss: 1.4550 2.3449 sec/batch
Epoch 19/20  Iteration 3248/3560 Training loss: 1.4544 2.3397 sec/batch
Epoch 19/20  Iteration 3249/3560 Training loss: 1.4546 2.0728 sec/batch
Epoch 19/20  Iteration 3250/3560 Training loss: 1.4538 1.9049 sec/batch
Epoch 19/20  Iteration 3251/3560 Training loss: 1.4537 1.9166 sec/batch
Epoch 19/20  Iteration 3252/3560 Training loss: 1.4534 2.1479 sec/batch
Epoch 19/20  Iteration 3253/3560 Training loss: 1.4533 2.0293 sec/batch
Epoch 19/20  Iteration 3254/3560 Training loss: 1.4539 2.0128 sec/batch
Epoch 19/20  Iteration 3255/3560 Training loss: 1.4533 1.9519 sec/batch
Epoch 19/20  Iteration 3256/3560 Training loss: 1.4540 1.8980 sec/batch
Epoch 19/20  Iteration 3257/3560 Training loss: 1.4539 1.9500 sec/batch
Epoch 19/20  Iteration 3258/3560 Training loss: 1.4541 2.0628 sec/batch
Epoch 19/20  Iteration 3259/3560 Training loss: 1.4538 1.9171 sec/batch
Epoch 19/20  Iteration 3260/3560 Training loss: 1.4539 1.9400 sec/batch
Epoch 19/20  Iteration 3261/3560 Training loss: 1.4543 1.9295 sec/batch
Epoch 19/20  Iteration 3262/3560 Training loss: 1.4540 1.9344 sec/batch
Epoch 19/20  Iteration 3263/3560 Training loss: 1.4534 2.0788 sec/batch
Epoch 19/20  Iteration 3264/3560 Training loss: 1.4540 2.6086 sec/batch
Epoch 19/20  Iteration 3265/3560 Training loss: 1.4539 2.4526 sec/batch
Epoch 19/20  Iteration 3266/3560 Training loss: 1.4549 1.9667 sec/batch
Epoch 19/20  Iteration 3267/3560 Training loss: 1.4552 1.9346 sec/batch
Epoch 19/20  Iteration 3268/3560 Training loss: 1.4554 2.0456 sec/batch
Epoch 19/20  Iteration 3269/3560 Training loss: 1.4552 2.1553 sec/batch
Epoch 19/20  Iteration 3270/3560 Training loss: 1.4552 2.1233 sec/batch
Epoch 19/20  Iteration 3271/3560 Training loss: 1.4555 1.9449 sec/batch
Epoch 19/20  Iteration 3272/3560 Training loss: 1.4553 1.8953 sec/batch
Epoch 19/20  Iteration 3273/3560 Training loss: 1.4552 1.9500 sec/batch
Epoch 19/20  Iteration 3274/3560 Training loss: 1.4551 1.8981 sec/batch
Epoch 19/20  Iteration 3275/3560 Training loss: 1.4556 1.9550 sec/batch
Epoch 19/20  Iteration 3276/3560 Training loss: 1.4558 1.9499 sec/batch
Epoch 19/20  Iteration 3277/3560 Training loss: 1.4563 1.9012 sec/batch
Epoch 19/20  Iteration 3278/3560 Training loss: 1.4561 1.9931 sec/batch
Epoch 19/20  Iteration 3279/3560 Training loss: 1.4560 1.9315 sec/batch
Epoch 19/20  Iteration 3280/3560 Training loss: 1.4561 1.9315 sec/batch
Epoch 19/20  Iteration 3281/3560 Training loss: 1.4560 1.8846 sec/batch
Epoch 19/20  Iteration 3282/3560 Training loss: 1.4558 1.8796 sec/batch
Epoch 19/20  Iteration 3283/3560 Training loss: 1.4552 1.9168 sec/batch
Epoch 19/20  Iteration 3284/3560 Training loss: 1.4551 1.8870 sec/batch
Epoch 19/20  Iteration 3285/3560 Training loss: 1.4547 2.1159 sec/batch
Epoch 19/20  Iteration 3286/3560 Training loss: 1.4547 2.0418 sec/batch
Epoch 19/20  Iteration 3287/3560 Training loss: 1.4542 1.9467 sec/batch
Epoch 19/20  Iteration 3288/3560 Training loss: 1.4542 1.9599 sec/batch
Epoch 19/20  Iteration 3289/3560 Training loss: 1.4540 2.0292 sec/batch
Epoch 19/20  Iteration 3290/3560 Training loss: 1.4537 2.1659 sec/batch
Epoch 19/20  Iteration 3291/3560 Training loss: 1.4535 2.0210 sec/batch
Epoch 19/20  Iteration 3292/3560 Training loss: 1.4533 1.9321 sec/batch
Epoch 19/20  Iteration 3293/3560 Training loss: 1.4529 1.9641 sec/batch
Epoch 19/20  Iteration 3294/3560 Training loss: 1.4531 1.9877 sec/batch
Epoch 19/20  Iteration 3295/3560 Training loss: 1.4527 1.9325 sec/batch
Epoch 19/20  Iteration 3296/3560 Training loss: 1.4526 1.9289 sec/batch
Epoch 19/20  Iteration 3297/3560 Training loss: 1.4522 1.9205 sec/batch
Epoch 19/20  Iteration 3298/3560 Training loss: 1.4520 1.9147 sec/batch
Epoch 19/20  Iteration 3299/3560 Training loss: 1.4518 1.9145 sec/batch
Epoch 19/20  Iteration 3300/3560 Training loss: 1.4518 1.8967 sec/batch
Epoch 19/20  Iteration 3301/3560 Training loss: 1.4517 1.9454 sec/batch
Epoch 19/20  Iteration 3302/3560 Training loss: 1.4514 1.8828 sec/batch
Epoch 19/20  Iteration 3303/3560 Training loss: 1.4510 1.9225 sec/batch
Epoch 19/20  Iteration 3304/3560 Training loss: 1.4507 2.0241 sec/batch
Epoch 19/20  Iteration 3305/3560 Training loss: 1.4508 1.9463 sec/batch
Epoch 19/20  Iteration 3306/3560 Training loss: 1.4507 2.0546 sec/batch
Epoch 19/20  Iteration 3307/3560 Training loss: 1.4506 2.0282 sec/batch
Epoch 19/20  Iteration 3308/3560 Training loss: 1.4505 1.9975 sec/batch
Epoch 19/20  Iteration 3309/3560 Training loss: 1.4503 1.9451 sec/batch
Epoch 19/20  Iteration 3310/3560 Training loss: 1.4502 1.9638 sec/batch
Epoch 19/20  Iteration 3311/3560 Training loss: 1.4502 2.2495 sec/batch
Epoch 19/20  Iteration 3312/3560 Training loss: 1.4502 2.0704 sec/batch
Epoch 19/20  Iteration 3313/3560 Training loss: 1.4500 2.1500 sec/batch
Epoch 19/20  Iteration 3314/3560 Training loss: 1.4502 1.9801 sec/batch
Epoch 19/20  Iteration 3315/3560 Training loss: 1.4500 2.0319 sec/batch
Epoch 19/20  Iteration 3316/3560 Training loss: 1.4499 1.9113 sec/batch
Epoch 19/20  Iteration 3317/3560 Training loss: 1.4500 2.0015 sec/batch
Epoch 19/20  Iteration 3318/3560 Training loss: 1.4498 1.9128 sec/batch
Epoch 19/20  Iteration 3319/3560 Training loss: 1.4496 1.9192 sec/batch
Epoch 19/20  Iteration 3320/3560 Training loss: 1.4493 1.9274 sec/batch
Epoch 19/20  Iteration 3321/3560 Training loss: 1.4492 1.9249 sec/batch
Epoch 19/20  Iteration 3322/3560 Training loss: 1.4493 1.9627 sec/batch
Epoch 19/20  Iteration 3323/3560 Training loss: 1.4492 1.9350 sec/batch
Epoch 19/20  Iteration 3324/3560 Training loss: 1.4492 1.8983 sec/batch
Epoch 19/20  Iteration 3325/3560 Training loss: 1.4491 1.9730 sec/batch
Epoch 19/20  Iteration 3326/3560 Training loss: 1.4486 1.9051 sec/batch
Epoch 19/20  Iteration 3327/3560 Training loss: 1.4483 2.0342 sec/batch
Epoch 19/20  Iteration 3328/3560 Training loss: 1.4484 2.0411 sec/batch
Epoch 19/20  Iteration 3329/3560 Training loss: 1.4484 1.9202 sec/batch
Epoch 19/20  Iteration 3330/3560 Training loss: 1.4481 1.9341 sec/batch
Epoch 19/20  Iteration 3331/3560 Training loss: 1.4481 1.8866 sec/batch
Epoch 19/20  Iteration 3332/3560 Training loss: 1.4481 2.1195 sec/batch
Epoch 19/20  Iteration 3333/3560 Training loss: 1.4479 2.1284 sec/batch
Epoch 19/20  Iteration 3334/3560 Training loss: 1.4476 1.9099 sec/batch
Epoch 19/20  Iteration 3335/3560 Training loss: 1.4472 1.9061 sec/batch
Epoch 19/20  Iteration 3336/3560 Training loss: 1.4470 1.9442 sec/batch
Epoch 19/20  Iteration 3337/3560 Training loss: 1.4472 1.9166 sec/batch
Epoch 19/20  Iteration 3338/3560 Training loss: 1.4473 1.9062 sec/batch
Epoch 19/20  Iteration 3339/3560 Training loss: 1.4474 1.8944 sec/batch
Epoch 19/20  Iteration 3340/3560 Training loss: 1.4475 1.9172 sec/batch
Epoch 19/20  Iteration 3341/3560 Training loss: 1.4476 1.9102 sec/batch
Epoch 19/20  Iteration 3342/3560 Training loss: 1.4478 1.8929 sec/batch
Epoch 19/20  Iteration 3343/3560 Training loss: 1.4478 1.9181 sec/batch
Epoch 19/20  Iteration 3344/3560 Training loss: 1.4477 1.9309 sec/batch
Epoch 19/20  Iteration 3345/3560 Training loss: 1.4481 1.8941 sec/batch
Epoch 19/20  Iteration 3346/3560 Training loss: 1.4481 1.8850 sec/batch
Epoch 19/20  Iteration 3347/3560 Training loss: 1.4481 1.9374 sec/batch
Epoch 19/20  Iteration 3348/3560 Training loss: 1.4482 1.9314 sec/batch
Epoch 19/20  Iteration 3349/3560 Training loss: 1.4482 2.0518 sec/batch
Epoch 19/20  Iteration 3350/3560 Training loss: 1.4483 2.0174 sec/batch
Epoch 19/20  Iteration 3351/3560 Training loss: 1.4483 1.9235 sec/batch
Epoch 19/20  Iteration 3352/3560 Training loss: 1.4486 1.9533 sec/batch
Epoch 19/20  Iteration 3353/3560 Training loss: 1.4487 2.1074 sec/batch
Epoch 19/20  Iteration 3354/3560 Training loss: 1.4486 2.5352 sec/batch
Epoch 19/20  Iteration 3355/3560 Training loss: 1.4482 2.0319 sec/batch
Epoch 19/20  Iteration 3356/3560 Training loss: 1.4482 1.9586 sec/batch
Epoch 19/20  Iteration 3357/3560 Training loss: 1.4483 1.9309 sec/batch
Epoch 19/20  Iteration 3358/3560 Training loss: 1.4483 1.9236 sec/batch
Epoch 19/20  Iteration 3359/3560 Training loss: 1.4483 1.9255 sec/batch
Epoch 19/20  Iteration 3360/3560 Training loss: 1.4482 1.9430 sec/batch
Epoch 19/20  Iteration 3361/3560 Training loss: 1.4483 1.9835 sec/batch
Epoch 19/20  Iteration 3362/3560 Training loss: 1.4483 1.9403 sec/batch
Epoch 19/20  Iteration 3363/3560 Training loss: 1.4481 1.9451 sec/batch
Epoch 19/20  Iteration 3364/3560 Training loss: 1.4481 1.9585 sec/batch
Epoch 19/20  Iteration 3365/3560 Training loss: 1.4483 1.9319 sec/batch
Epoch 19/20  Iteration 3366/3560 Training loss: 1.4484 1.9259 sec/batch
Epoch 19/20  Iteration 3367/3560 Training loss: 1.4483 1.9482 sec/batch
Epoch 19/20  Iteration 3368/3560 Training loss: 1.4483 1.9268 sec/batch
Epoch 19/20  Iteration 3369/3560 Training loss: 1.4483 1.9384 sec/batch
Epoch 19/20  Iteration 3370/3560 Training loss: 1.4483 2.0683 sec/batch
Epoch 19/20  Iteration 3371/3560 Training loss: 1.4485 1.9844 sec/batch
Epoch 19/20  Iteration 3372/3560 Training loss: 1.4489 1.9463 sec/batch
Epoch 19/20  Iteration 3373/3560 Training loss: 1.4489 2.0032 sec/batch
Epoch 19/20  Iteration 3374/3560 Training loss: 1.4490 2.1781 sec/batch
Epoch 19/20  Iteration 3375/3560 Training loss: 1.4489 2.1932 sec/batch
Epoch 19/20  Iteration 3376/3560 Training loss: 1.4488 2.0222 sec/batch
Epoch 19/20  Iteration 3377/3560 Training loss: 1.4489 1.9330 sec/batch
Epoch 19/20  Iteration 3378/3560 Training loss: 1.4489 1.9715 sec/batch
Epoch 19/20  Iteration 3379/3560 Training loss: 1.4490 1.9281 sec/batch
Epoch 19/20  Iteration 3380/3560 Training loss: 1.4489 1.8995 sec/batch
Epoch 19/20  Iteration 3381/3560 Training loss: 1.4488 1.9773 sec/batch
Epoch 19/20  Iteration 3382/3560 Training loss: 1.4490 1.9000 sec/batch
Epoch 20/20  Iteration 3383/3560 Training loss: 1.5229 1.9142 sec/batch
Epoch 20/20  Iteration 3384/3560 Training loss: 1.4944 1.9581 sec/batch
Epoch 20/20  Iteration 3385/3560 Training loss: 1.4781 1.9366 sec/batch
Epoch 20/20  Iteration 3386/3560 Training loss: 1.4734 1.9267 sec/batch
Epoch 20/20  Iteration 3387/3560 Training loss: 1.4665 1.8954 sec/batch
Epoch 20/20  Iteration 3388/3560 Training loss: 1.4561 1.9138 sec/batch
Epoch 20/20  Iteration 3389/3560 Training loss: 1.4578 1.9049 sec/batch
Epoch 20/20  Iteration 3390/3560 Training loss: 1.4543 1.9179 sec/batch
Epoch 20/20  Iteration 3391/3560 Training loss: 1.4539 2.0429 sec/batch
Epoch 20/20  Iteration 3392/3560 Training loss: 1.4521 2.0348 sec/batch
Epoch 20/20  Iteration 3393/3560 Training loss: 1.4484 1.9297 sec/batch
Epoch 20/20  Iteration 3394/3560 Training loss: 1.4483 1.9158 sec/batch
Epoch 20/20  Iteration 3395/3560 Training loss: 1.4478 1.9115 sec/batch
Epoch 20/20  Iteration 3396/3560 Training loss: 1.4491 2.2484 sec/batch
Epoch 20/20  Iteration 3397/3560 Training loss: 1.4477 2.0360 sec/batch
Epoch 20/20  Iteration 3398/3560 Training loss: 1.4455 1.9214 sec/batch
Epoch 20/20  Iteration 3399/3560 Training loss: 1.4463 1.9371 sec/batch
Epoch 20/20  Iteration 3400/3560 Training loss: 1.4474 1.9002 sec/batch
Epoch 20/20  Iteration 3401/3560 Training loss: 1.4479 1.9360 sec/batch
Epoch 20/20  Iteration 3402/3560 Training loss: 1.4489 1.9477 sec/batch
Epoch 20/20  Iteration 3403/3560 Training loss: 1.4486 1.9567 sec/batch
Epoch 20/20  Iteration 3404/3560 Training loss: 1.4487 1.9259 sec/batch
Epoch 20/20  Iteration 3405/3560 Training loss: 1.4483 1.9206 sec/batch
Epoch 20/20  Iteration 3406/3560 Training loss: 1.4487 1.9241 sec/batch
Epoch 20/20  Iteration 3407/3560 Training loss: 1.4485 1.9504 sec/batch
Epoch 20/20  Iteration 3408/3560 Training loss: 1.4474 2.1006 sec/batch
Epoch 20/20  Iteration 3409/3560 Training loss: 1.4465 2.6106 sec/batch
Epoch 20/20  Iteration 3410/3560 Training loss: 1.4473 2.0664 sec/batch
Epoch 20/20  Iteration 3411/3560 Training loss: 1.4476 2.5504 sec/batch
Epoch 20/20  Iteration 3412/3560 Training loss: 1.4476 2.4450 sec/batch
Epoch 20/20  Iteration 3413/3560 Training loss: 1.4472 2.0497 sec/batch
Epoch 20/20  Iteration 3414/3560 Training loss: 1.4467 1.9590 sec/batch
Epoch 20/20  Iteration 3415/3560 Training loss: 1.4471 1.8952 sec/batch
Epoch 20/20  Iteration 3416/3560 Training loss: 1.4471 2.0453 sec/batch
Epoch 20/20  Iteration 3417/3560 Training loss: 1.4467 2.1398 sec/batch
Epoch 20/20  Iteration 3418/3560 Training loss: 1.4465 2.0679 sec/batch
Epoch 20/20  Iteration 3419/3560 Training loss: 1.4457 1.9290 sec/batch
Epoch 20/20  Iteration 3420/3560 Training loss: 1.4446 1.8847 sec/batch
Epoch 20/20  Iteration 3421/3560 Training loss: 1.4433 1.9124 sec/batch
Epoch 20/20  Iteration 3422/3560 Training loss: 1.4428 1.9171 sec/batch
Epoch 20/20  Iteration 3423/3560 Training loss: 1.4425 1.9355 sec/batch
Epoch 20/20  Iteration 3424/3560 Training loss: 1.4433 1.9078 sec/batch
Epoch 20/20  Iteration 3425/3560 Training loss: 1.4429 1.8522 sec/batch
Epoch 20/20  Iteration 3426/3560 Training loss: 1.4422 1.9022 sec/batch
Epoch 20/20  Iteration 3427/3560 Training loss: 1.4426 1.9227 sec/batch
Epoch 20/20  Iteration 3428/3560 Training loss: 1.4416 1.9054 sec/batch
Epoch 20/20  Iteration 3429/3560 Training loss: 1.4414 1.8980 sec/batch
Epoch 20/20  Iteration 3430/3560 Training loss: 1.4411 1.8955 sec/batch
Epoch 20/20  Iteration 3431/3560 Training loss: 1.4408 1.9163 sec/batch
Epoch 20/20  Iteration 3432/3560 Training loss: 1.4410 1.9037 sec/batch
Epoch 20/20  Iteration 3433/3560 Training loss: 1.4406 1.9929 sec/batch
Epoch 20/20  Iteration 3434/3560 Training loss: 1.4413 2.1752 sec/batch
Epoch 20/20  Iteration 3435/3560 Training loss: 1.4413 1.9851 sec/batch
Epoch 20/20  Iteration 3436/3560 Training loss: 1.4416 1.9155 sec/batch
Epoch 20/20  Iteration 3437/3560 Training loss: 1.4413 1.9514 sec/batch
Epoch 20/20  Iteration 3438/3560 Training loss: 1.4413 2.1569 sec/batch
Epoch 20/20  Iteration 3439/3560 Training loss: 1.4418 1.9620 sec/batch
Epoch 20/20  Iteration 3440/3560 Training loss: 1.4415 1.9152 sec/batch
Epoch 20/20  Iteration 3441/3560 Training loss: 1.4410 1.8960 sec/batch
Epoch 20/20  Iteration 3442/3560 Training loss: 1.4415 1.8938 sec/batch
Epoch 20/20  Iteration 3443/3560 Training loss: 1.4413 1.8794 sec/batch
Epoch 20/20  Iteration 3444/3560 Training loss: 1.4424 1.9049 sec/batch
Epoch 20/20  Iteration 3445/3560 Training loss: 1.4428 1.9372 sec/batch
Epoch 20/20  Iteration 3446/3560 Training loss: 1.4431 1.8682 sec/batch
Epoch 20/20  Iteration 3447/3560 Training loss: 1.4429 1.9168 sec/batch
Epoch 20/20  Iteration 3448/3560 Training loss: 1.4433 1.9716 sec/batch
Epoch 20/20  Iteration 3449/3560 Training loss: 1.4434 1.9118 sec/batch
Epoch 20/20  Iteration 3450/3560 Training loss: 1.4432 1.9830 sec/batch
Epoch 20/20  Iteration 3451/3560 Training loss: 1.4433 2.1800 sec/batch
Epoch 20/20  Iteration 3452/3560 Training loss: 1.4431 2.2583 sec/batch
Epoch 20/20  Iteration 3453/3560 Training loss: 1.4438 2.3566 sec/batch
Epoch 20/20  Iteration 3454/3560 Training loss: 1.4441 2.9002 sec/batch
Epoch 20/20  Iteration 3455/3560 Training loss: 1.4446 2.5251 sec/batch
Epoch 20/20  Iteration 3456/3560 Training loss: 1.4443 2.4171 sec/batch
Epoch 20/20  Iteration 3457/3560 Training loss: 1.4443 2.4540 sec/batch
Epoch 20/20  Iteration 3458/3560 Training loss: 1.4445 2.2154 sec/batch
Epoch 20/20  Iteration 3459/3560 Training loss: 1.4444 1.9879 sec/batch
Epoch 20/20  Iteration 3460/3560 Training loss: 1.4443 1.9262 sec/batch
Epoch 20/20  Iteration 3461/3560 Training loss: 1.4437 1.9090 sec/batch
Epoch 20/20  Iteration 3462/3560 Training loss: 1.4437 1.9229 sec/batch
Epoch 20/20  Iteration 3463/3560 Training loss: 1.4432 1.9897 sec/batch
Epoch 20/20  Iteration 3464/3560 Training loss: 1.4432 1.9575 sec/batch
Epoch 20/20  Iteration 3465/3560 Training loss: 1.4427 1.9157 sec/batch
Epoch 20/20  Iteration 3466/3560 Training loss: 1.4426 2.2498 sec/batch
Epoch 20/20  Iteration 3467/3560 Training loss: 1.4425 2.5152 sec/batch
Epoch 20/20  Iteration 3468/3560 Training loss: 1.4423 2.0770 sec/batch
Epoch 20/20  Iteration 3469/3560 Training loss: 1.4420 1.8896 sec/batch
Epoch 20/20  Iteration 3470/3560 Training loss: 1.4419 1.9554 sec/batch
Epoch 20/20  Iteration 3471/3560 Training loss: 1.4414 1.9814 sec/batch
Epoch 20/20  Iteration 3472/3560 Training loss: 1.4415 1.8671 sec/batch
Epoch 20/20  Iteration 3473/3560 Training loss: 1.4412 1.8897 sec/batch
Epoch 20/20  Iteration 3474/3560 Training loss: 1.4411 2.0177 sec/batch
Epoch 20/20  Iteration 3475/3560 Training loss: 1.4408 2.0080 sec/batch
Epoch 20/20  Iteration 3476/3560 Training loss: 1.4404 1.9338 sec/batch
Epoch 20/20  Iteration 3477/3560 Training loss: 1.4401 1.9014 sec/batch
Epoch 20/20  Iteration 3478/3560 Training loss: 1.4403 2.0181 sec/batch
Epoch 20/20  Iteration 3479/3560 Training loss: 1.4403 2.0897 sec/batch
Epoch 20/20  Iteration 3480/3560 Training loss: 1.4400 2.0852 sec/batch
Epoch 20/20  Iteration 3481/3560 Training loss: 1.4396 1.9620 sec/batch
Epoch 20/20  Iteration 3482/3560 Training loss: 1.4391 1.8628 sec/batch
Epoch 20/20  Iteration 3483/3560 Training loss: 1.4391 1.9047 sec/batch
Epoch 20/20  Iteration 3484/3560 Training loss: 1.4391 1.9302 sec/batch
Epoch 20/20  Iteration 3485/3560 Training loss: 1.4389 1.9217 sec/batch
Epoch 20/20  Iteration 3486/3560 Training loss: 1.4388 1.9402 sec/batch
Epoch 20/20  Iteration 3487/3560 Training loss: 1.4386 1.8971 sec/batch
Epoch 20/20  Iteration 3488/3560 Training loss: 1.4385 1.9338 sec/batch
Epoch 20/20  Iteration 3489/3560 Training loss: 1.4384 1.9045 sec/batch
Epoch 20/20  Iteration 3490/3560 Training loss: 1.4383 1.9074 sec/batch
Epoch 20/20  Iteration 3491/3560 Training loss: 1.4382 1.9103 sec/batch
Epoch 20/20  Iteration 3492/3560 Training loss: 1.4382 1.9725 sec/batch
Epoch 20/20  Iteration 3493/3560 Training loss: 1.4381 2.0979 sec/batch
Epoch 20/20  Iteration 3494/3560 Training loss: 1.4380 1.9348 sec/batch
Epoch 20/20  Iteration 3495/3560 Training loss: 1.4378 2.0134 sec/batch
Epoch 20/20  Iteration 3496/3560 Training loss: 1.4377 2.0087 sec/batch
Epoch 20/20  Iteration 3497/3560 Training loss: 1.4373 1.9011 sec/batch
Epoch 20/20  Iteration 3498/3560 Training loss: 1.4370 1.8935 sec/batch
Epoch 20/20  Iteration 3499/3560 Training loss: 1.4370 1.9056 sec/batch
Epoch 20/20  Iteration 3500/3560 Training loss: 1.4370 2.1819 sec/batch
Epoch 20/20  Iteration 3501/3560 Training loss: 1.4368 2.1095 sec/batch
Epoch 20/20  Iteration 3502/3560 Training loss: 1.4368 1.8557 sec/batch
Epoch 20/20  Iteration 3503/3560 Training loss: 1.4368 2.2458 sec/batch
Epoch 20/20  Iteration 3504/3560 Training loss: 1.4365 1.9513 sec/batch
Epoch 20/20  Iteration 3505/3560 Training loss: 1.4361 1.8711 sec/batch
Epoch 20/20  Iteration 3506/3560 Training loss: 1.4362 1.9882 sec/batch
Epoch 20/20  Iteration 3507/3560 Training loss: 1.4361 1.9178 sec/batch
Epoch 20/20  Iteration 3508/3560 Training loss: 1.4358 1.8930 sec/batch
Epoch 20/20  Iteration 3509/3560 Training loss: 1.4359 1.9049 sec/batch
Epoch 20/20  Iteration 3510/3560 Training loss: 1.4359 1.9221 sec/batch
Epoch 20/20  Iteration 3511/3560 Training loss: 1.4357 1.8923 sec/batch
Epoch 20/20  Iteration 3512/3560 Training loss: 1.4355 1.8870 sec/batch
Epoch 20/20  Iteration 3513/3560 Training loss: 1.4351 1.9121 sec/batch
Epoch 20/20  Iteration 3514/3560 Training loss: 1.4349 1.8942 sec/batch
Epoch 20/20  Iteration 3515/3560 Training loss: 1.4350 1.9018 sec/batch
Epoch 20/20  Iteration 3516/3560 Training loss: 1.4350 1.9445 sec/batch
Epoch 20/20  Iteration 3517/3560 Training loss: 1.4350 2.0288 sec/batch
Epoch 20/20  Iteration 3518/3560 Training loss: 1.4351 1.9226 sec/batch
Epoch 20/20  Iteration 3519/3560 Training loss: 1.4353 1.9250 sec/batch
Epoch 20/20  Iteration 3520/3560 Training loss: 1.4354 1.8857 sec/batch
Epoch 20/20  Iteration 3521/3560 Training loss: 1.4354 2.0408 sec/batch
Epoch 20/20  Iteration 3522/3560 Training loss: 1.4354 2.1225 sec/batch
Epoch 20/20  Iteration 3523/3560 Training loss: 1.4357 1.8860 sec/batch
Epoch 20/20  Iteration 3524/3560 Training loss: 1.4358 1.8944 sec/batch
Epoch 20/20  Iteration 3525/3560 Training loss: 1.4357 2.0019 sec/batch
Epoch 20/20  Iteration 3526/3560 Training loss: 1.4358 1.8769 sec/batch
Epoch 20/20  Iteration 3527/3560 Training loss: 1.4357 1.9420 sec/batch
Epoch 20/20  Iteration 3528/3560 Training loss: 1.4359 1.9138 sec/batch
Epoch 20/20  Iteration 3529/3560 Training loss: 1.4359 1.9049 sec/batch
Epoch 20/20  Iteration 3530/3560 Training loss: 1.4361 1.8770 sec/batch
Epoch 20/20  Iteration 3531/3560 Training loss: 1.4361 1.9182 sec/batch
Epoch 20/20  Iteration 3532/3560 Training loss: 1.4360 1.8888 sec/batch
Epoch 20/20  Iteration 3533/3560 Training loss: 1.4358 1.8924 sec/batch
Epoch 20/20  Iteration 3534/3560 Training loss: 1.4358 1.8561 sec/batch
Epoch 20/20  Iteration 3535/3560 Training loss: 1.4358 1.8544 sec/batch
Epoch 20/20  Iteration 3536/3560 Training loss: 1.4358 1.9095 sec/batch
Epoch 20/20  Iteration 3537/3560 Training loss: 1.4358 1.9176 sec/batch
Epoch 20/20  Iteration 3538/3560 Training loss: 1.4357 1.9911 sec/batch
Epoch 20/20  Iteration 3539/3560 Training loss: 1.4358 1.9246 sec/batch
Epoch 20/20  Iteration 3540/3560 Training loss: 1.4359 1.8883 sec/batch
Epoch 20/20  Iteration 3541/3560 Training loss: 1.4357 1.9077 sec/batch
Epoch 20/20  Iteration 3542/3560 Training loss: 1.4358 1.8540 sec/batch
Epoch 20/20  Iteration 3543/3560 Training loss: 1.4361 2.0566 sec/batch
Epoch 20/20  Iteration 3544/3560 Training loss: 1.4361 2.1165 sec/batch
Epoch 20/20  Iteration 3545/3560 Training loss: 1.4361 1.9142 sec/batch
Epoch 20/20  Iteration 3546/3560 Training loss: 1.4361 1.8192 sec/batch
Epoch 20/20  Iteration 3547/3560 Training loss: 1.4361 1.8968 sec/batch
Epoch 20/20  Iteration 3548/3560 Training loss: 1.4362 1.9059 sec/batch
Epoch 20/20  Iteration 3549/3560 Training loss: 1.4364 1.8827 sec/batch
Epoch 20/20  Iteration 3550/3560 Training loss: 1.4368 1.8726 sec/batch
Epoch 20/20  Iteration 3551/3560 Training loss: 1.4368 1.9180 sec/batch
Epoch 20/20  Iteration 3552/3560 Training loss: 1.4369 1.8850 sec/batch
Epoch 20/20  Iteration 3553/3560 Training loss: 1.4368 1.9096 sec/batch
Epoch 20/20  Iteration 3554/3560 Training loss: 1.4366 1.9428 sec/batch
Epoch 20/20  Iteration 3555/3560 Training loss: 1.4368 1.9996 sec/batch
Epoch 20/20  Iteration 3556/3560 Training loss: 1.4368 1.9410 sec/batch
Epoch 20/20  Iteration 3557/3560 Training loss: 1.4368 1.8911 sec/batch
Epoch 20/20  Iteration 3558/3560 Training loss: 1.4367 1.8901 sec/batch
Epoch 20/20  Iteration 3559/3560 Training loss: 1.4366 1.8611 sec/batch
Epoch 20/20  Iteration 3560/3560 Training loss: 1.4367 2.0077 sec/batch
Epoch 1/20  Iteration 1/3560 Training loss: 4.4237 2.8514 sec/batch
Epoch 1/20  Iteration 2/3560 Training loss: 4.3732 2.7181 sec/batch
Epoch 1/20  Iteration 3/3560 Training loss: 4.4217 3.0299 sec/batch
Epoch 1/20  Iteration 4/3560 Training loss: 4.2956 2.7974 sec/batch
Epoch 1/20  Iteration 5/3560 Training loss: 4.1949 2.7031 sec/batch
Epoch 1/20  Iteration 6/3560 Training loss: 4.1125 2.6752 sec/batch
Epoch 1/20  Iteration 7/3560 Training loss: 4.0262 2.7966 sec/batch
Epoch 1/20  Iteration 8/3560 Training loss: 3.9447 2.7499 sec/batch
Epoch 1/20  Iteration 9/3560 Training loss: 3.8726 2.7114 sec/batch
Epoch 1/20  Iteration 10/3560 Training loss: 3.8150 2.7805 sec/batch
Epoch 1/20  Iteration 11/3560 Training loss: 3.7634 2.7196 sec/batch
Epoch 1/20  Iteration 12/3560 Training loss: 3.7192 3.3169 sec/batch
Epoch 1/20  Iteration 13/3560 Training loss: 3.6806 3.1282 sec/batch
Epoch 1/20  Iteration 14/3560 Training loss: 3.6481 2.9871 sec/batch
Epoch 1/20  Iteration 15/3560 Training loss: 3.6193 3.1176 sec/batch
Epoch 1/20  Iteration 16/3560 Training loss: 3.5936 3.0032 sec/batch
Epoch 1/20  Iteration 17/3560 Training loss: 3.5689 3.5173 sec/batch
Epoch 1/20  Iteration 18/3560 Training loss: 3.5483 3.1598 sec/batch
Epoch 1/20  Iteration 19/3560 Training loss: 3.5286 2.7084 sec/batch
Epoch 1/20  Iteration 20/3560 Training loss: 3.5092 2.7561 sec/batch
Epoch 1/20  Iteration 21/3560 Training loss: 3.4920 3.4579 sec/batch
Epoch 1/20  Iteration 22/3560 Training loss: 3.4763 3.2249 sec/batch
Epoch 1/20  Iteration 23/3560 Training loss: 3.4618 2.6192 sec/batch
Epoch 1/20  Iteration 24/3560 Training loss: 3.4483 2.7289 sec/batch
Epoch 1/20  Iteration 25/3560 Training loss: 3.4354 2.6985 sec/batch
Epoch 1/20  Iteration 26/3560 Training loss: 3.4239 2.6345 sec/batch
Epoch 1/20  Iteration 27/3560 Training loss: 3.4132 2.8406 sec/batch
Epoch 1/20  Iteration 28/3560 Training loss: 3.4020 2.5596 sec/batch
Epoch 1/20  Iteration 29/3560 Training loss: 3.3921 2.4947 sec/batch
Epoch 1/20  Iteration 30/3560 Training loss: 3.3830 2.5586 sec/batch
Epoch 1/20  Iteration 31/3560 Training loss: 3.3750 2.7408 sec/batch
Epoch 1/20  Iteration 32/3560 Training loss: 3.3663 2.5962 sec/batch
Epoch 1/20  Iteration 33/3560 Training loss: 3.3580 2.5103 sec/batch
Epoch 1/20  Iteration 34/3560 Training loss: 3.3506 2.5072 sec/batch
Epoch 1/20  Iteration 35/3560 Training loss: 3.3429 2.5528 sec/batch
Epoch 1/20  Iteration 36/3560 Training loss: 3.3360 2.5124 sec/batch
Epoch 1/20  Iteration 37/3560 Training loss: 3.3286 2.5174 sec/batch
Epoch 1/20  Iteration 38/3560 Training loss: 3.3217 2.7012 sec/batch
Epoch 1/20  Iteration 39/3560 Training loss: 3.3149 2.5952 sec/batch
Epoch 1/20  Iteration 40/3560 Training loss: 3.3086 2.4828 sec/batch
Epoch 1/20  Iteration 41/3560 Training loss: 3.3024 2.6086 sec/batch
Epoch 1/20  Iteration 42/3560 Training loss: 3.2964 2.5782 sec/batch
Epoch 1/20  Iteration 43/3560 Training loss: 3.2905 2.4952 sec/batch
Epoch 1/20  Iteration 44/3560 Training loss: 3.2848 2.6130 sec/batch
Epoch 1/20  Iteration 45/3560 Training loss: 3.2793 2.7035 sec/batch
Epoch 1/20  Iteration 46/3560 Training loss: 3.2740 2.5685 sec/batch
Epoch 1/20  Iteration 47/3560 Training loss: 3.2691 2.5085 sec/batch
Epoch 1/20  Iteration 48/3560 Training loss: 3.2643 2.4904 sec/batch
Epoch 1/20  Iteration 49/3560 Training loss: 3.2597 2.5281 sec/batch
Epoch 1/20  Iteration 50/3560 Training loss: 3.2551 2.4940 sec/batch
Epoch 1/20  Iteration 51/3560 Training loss: 3.2505 2.5294 sec/batch
Epoch 1/20  Iteration 52/3560 Training loss: 3.2458 2.5206 sec/batch
Epoch 1/20  Iteration 53/3560 Training loss: 3.2414 2.5138 sec/batch
Epoch 1/20  Iteration 54/3560 Training loss: 3.2368 2.4917 sec/batch
Epoch 1/20  Iteration 55/3560 Training loss: 3.2325 2.6114 sec/batch
Epoch 1/20  Iteration 56/3560 Training loss: 3.2279 2.5668 sec/batch
Epoch 1/20  Iteration 57/3560 Training loss: 3.2234 2.5061 sec/batch
Epoch 1/20  Iteration 58/3560 Training loss: 3.2190 2.5396 sec/batch
Epoch 1/20  Iteration 59/3560 Training loss: 3.2145 2.7530 sec/batch
Epoch 1/20  Iteration 60/3560 Training loss: 3.2102 2.5797 sec/batch
Epoch 1/20  Iteration 61/3560 Training loss: 3.2059 2.4878 sec/batch
Epoch 1/20  Iteration 62/3560 Training loss: 3.2019 2.5226 sec/batch
Epoch 1/20  Iteration 63/3560 Training loss: 3.1979 2.5161 sec/batch
Epoch 1/20  Iteration 64/3560 Training loss: 3.1932 2.5352 sec/batch
Epoch 1/20  Iteration 65/3560 Training loss: 3.1885 2.4980 sec/batch
Epoch 1/20  Iteration 66/3560 Training loss: 3.1841 2.5201 sec/batch
Epoch 1/20  Iteration 67/3560 Training loss: 3.1797 2.4936 sec/batch
Epoch 1/20  Iteration 68/3560 Training loss: 3.1745 2.5221 sec/batch
Epoch 1/20  Iteration 69/3560 Training loss: 3.1696 2.5953 sec/batch
Epoch 1/20  Iteration 70/3560 Training loss: 3.1651 2.8758 sec/batch
Epoch 1/20  Iteration 71/3560 Training loss: 3.1604 2.4812 sec/batch
Epoch 1/20  Iteration 72/3560 Training loss: 3.1558 2.5130 sec/batch
Epoch 1/20  Iteration 73/3560 Training loss: 3.1508 2.7716 sec/batch
Epoch 1/20  Iteration 74/3560 Training loss: 3.1457 2.5978 sec/batch
Epoch 1/20  Iteration 75/3560 Training loss: 3.1407 2.4914 sec/batch
Epoch 1/20  Iteration 76/3560 Training loss: 3.1358 2.5360 sec/batch
Epoch 1/20  Iteration 77/3560 Training loss: 3.1306 2.5334 sec/batch
Epoch 1/20  Iteration 78/3560 Training loss: 3.1254 2.4908 sec/batch
Epoch 1/20  Iteration 79/3560 Training loss: 3.1199 2.5212 sec/batch
Epoch 1/20  Iteration 80/3560 Training loss: 3.1145 2.5320 sec/batch
Epoch 1/20  Iteration 81/3560 Training loss: 3.1092 2.4994 sec/batch
Epoch 1/20  Iteration 82/3560 Training loss: 3.1041 2.4961 sec/batch
Epoch 1/20  Iteration 83/3560 Training loss: 3.0986 2.6384 sec/batch
Epoch 1/20  Iteration 84/3560 Training loss: 3.0930 2.5630 sec/batch
Epoch 1/20  Iteration 85/3560 Training loss: 3.0872 2.6797 sec/batch
...
Epoch 1/20  Iteration 178/3560 Training loss: 2.7108 2.5794 sec/batch
Epoch 2/20  Iteration 179/3560 Training loss: 2.3607 2.7393 sec/batch
...
Epoch 2/20  Iteration 356/3560 Training loss: 2.0750 2.5736 sec/batch
Epoch 3/20  Iteration 357/3560 Training loss: 2.0251 2.5633 sec/batch
...
Epoch 3/20  Iteration 534/3560 Training loss: 1.8535 2.4982 sec/batch
Epoch 4/20  Iteration 535/3560 Training loss: 1.8436 2.5816 sec/batch
...
Epoch 4/20  Iteration 712/3560 Training loss: 1.7044 2.4789 sec/batch
Epoch 5/20  Iteration 713/3560 Training loss: 1.7310 2.4912 sec/batch
...
Epoch 5/20  Iteration 890/3560 Training loss: 1.6011 2.5015 sec/batch
Epoch 6/20  Iteration 891/3560 Training loss: 1.6434 2.5076 sec/batch
...
Epoch 6/20  Iteration 999/3560 Training loss: 1.5343 2.5357 sec/batch
Epoch 6/20  Iteration 1000/3560 Training loss: 1.5342 2.5171 sec/batch
Epoch 6/20  Iteration 1001/3560 Training loss: 1.5340 2.5293 sec/batch
Epoch 6/20  Iteration 1002/3560 Training loss: 1.5337 2.8010 sec/batch
Epoch 6/20  Iteration 1003/3560 Training loss: 1.5335 2.4818 sec/batch
Epoch 6/20  Iteration 1004/3560 Training loss: 1.5332 2.4896 sec/batch
Epoch 6/20  Iteration 1005/3560 Training loss: 1.5328 2.4928 sec/batch
Epoch 6/20  Iteration 1006/3560 Training loss: 1.5323 2.4855 sec/batch
Epoch 6/20  Iteration 1007/3560 Training loss: 1.5322 2.5393 sec/batch
Epoch 6/20  Iteration 1008/3560 Training loss: 1.5320 2.4810 sec/batch
Epoch 6/20  Iteration 1009/3560 Training loss: 1.5317 2.5244 sec/batch
Epoch 6/20  Iteration 1010/3560 Training loss: 1.5315 2.4969 sec/batch
Epoch 6/20  Iteration 1011/3560 Training loss: 1.5312 2.7571 sec/batch
Epoch 6/20  Iteration 1012/3560 Training loss: 1.5308 2.5677 sec/batch
Epoch 6/20  Iteration 1013/3560 Training loss: 1.5303 2.5763 sec/batch
Epoch 6/20  Iteration 1014/3560 Training loss: 1.5302 2.5508 sec/batch
Epoch 6/20  Iteration 1015/3560 Training loss: 1.5301 2.4834 sec/batch
Epoch 6/20  Iteration 1016/3560 Training loss: 1.5296 2.5960 sec/batch
Epoch 6/20  Iteration 1017/3560 Training loss: 1.5296 2.7797 sec/batch
Epoch 6/20  Iteration 1018/3560 Training loss: 1.5295 2.4780 sec/batch
Epoch 6/20  Iteration 1019/3560 Training loss: 1.5291 2.5117 sec/batch
Epoch 6/20  Iteration 1020/3560 Training loss: 1.5287 2.5111 sec/batch
Epoch 6/20  Iteration 1021/3560 Training loss: 1.5281 2.5252 sec/batch
Epoch 6/20  Iteration 1022/3560 Training loss: 1.5278 2.5165 sec/batch
Epoch 6/20  Iteration 1023/3560 Training loss: 1.5278 2.4951 sec/batch
Epoch 6/20  Iteration 1024/3560 Training loss: 1.5277 2.5388 sec/batch
Epoch 6/20  Iteration 1025/3560 Training loss: 1.5276 2.5051 sec/batch
Epoch 6/20  Iteration 1026/3560 Training loss: 1.5275 2.4697 sec/batch
Epoch 6/20  Iteration 1027/3560 Training loss: 1.5275 2.6554 sec/batch
Epoch 6/20  Iteration 1028/3560 Training loss: 1.5274 2.6265 sec/batch
Epoch 6/20  Iteration 1029/3560 Training loss: 1.5273 2.5595 sec/batch
Epoch 6/20  Iteration 1030/3560 Training loss: 1.5271 2.4726 sec/batch
Epoch 6/20  Iteration 1031/3560 Training loss: 1.5273 2.5949 sec/batch
Epoch 6/20  Iteration 1032/3560 Training loss: 1.5271 2.7622 sec/batch
Epoch 6/20  Iteration 1033/3560 Training loss: 1.5270 2.5005 sec/batch
Epoch 6/20  Iteration 1034/3560 Training loss: 1.5271 2.4878 sec/batch
Epoch 6/20  Iteration 1035/3560 Training loss: 1.5268 2.4863 sec/batch
Epoch 6/20  Iteration 1036/3560 Training loss: 1.5268 2.5086 sec/batch
Epoch 6/20  Iteration 1037/3560 Training loss: 1.5267 2.5052 sec/batch
Epoch 6/20  Iteration 1038/3560 Training loss: 1.5269 2.4898 sec/batch
Epoch 6/20  Iteration 1039/3560 Training loss: 1.5268 2.4966 sec/batch
Epoch 6/20  Iteration 1040/3560 Training loss: 1.5265 2.5240 sec/batch
Epoch 6/20  Iteration 1041/3560 Training loss: 1.5261 2.4846 sec/batch
Epoch 6/20  Iteration 1042/3560 Training loss: 1.5260 2.4640 sec/batch
Epoch 6/20  Iteration 1043/3560 Training loss: 1.5260 2.5817 sec/batch
Epoch 6/20  Iteration 1044/3560 Training loss: 1.5259 2.5629 sec/batch
Epoch 6/20  Iteration 1045/3560 Training loss: 1.5257 2.4997 sec/batch
Epoch 6/20  Iteration 1046/3560 Training loss: 1.5256 2.5883 sec/batch
Epoch 6/20  Iteration 1047/3560 Training loss: 1.5256 2.7680 sec/batch
Epoch 6/20  Iteration 1048/3560 Training loss: 1.5254 2.4924 sec/batch
Epoch 6/20  Iteration 1049/3560 Training loss: 1.5250 2.4776 sec/batch
Epoch 6/20  Iteration 1050/3560 Training loss: 1.5250 2.4848 sec/batch
Epoch 6/20  Iteration 1051/3560 Training loss: 1.5250 2.5556 sec/batch
Epoch 6/20  Iteration 1052/3560 Training loss: 1.5248 2.4962 sec/batch
Epoch 6/20  Iteration 1053/3560 Training loss: 1.5248 2.4809 sec/batch
Epoch 6/20  Iteration 1054/3560 Training loss: 1.5246 2.5072 sec/batch
Epoch 6/20  Iteration 1055/3560 Training loss: 1.5246 2.5175 sec/batch
Epoch 6/20  Iteration 1056/3560 Training loss: 1.5244 2.4858 sec/batch
Epoch 6/20  Iteration 1057/3560 Training loss: 1.5244 2.5367 sec/batch
Epoch 6/20  Iteration 1058/3560 Training loss: 1.5248 2.6144 sec/batch
Epoch 6/20  Iteration 1059/3560 Training loss: 1.5247 2.5283 sec/batch
Epoch 6/20  Iteration 1060/3560 Training loss: 1.5245 2.4825 sec/batch
Epoch 6/20  Iteration 1061/3560 Training loss: 1.5243 2.5900 sec/batch
Epoch 6/20  Iteration 1062/3560 Training loss: 1.5241 2.7528 sec/batch
Epoch 6/20  Iteration 1063/3560 Training loss: 1.5241 2.5102 sec/batch
Epoch 6/20  Iteration 1064/3560 Training loss: 1.5240 2.4693 sec/batch
Epoch 6/20  Iteration 1065/3560 Training loss: 1.5239 2.4841 sec/batch
Epoch 6/20  Iteration 1066/3560 Training loss: 1.5236 2.4930 sec/batch
Epoch 6/20  Iteration 1067/3560 Training loss: 1.5233 2.5245 sec/batch
Epoch 6/20  Iteration 1068/3560 Training loss: 1.5233 2.7335 sec/batch
Epoch 7/20  Iteration 1069/3560 Training loss: 1.5736 2.6867 sec/batch
Epoch 7/20  Iteration 1070/3560 Training loss: 1.5336 2.5977 sec/batch
Epoch 7/20  Iteration 1071/3560 Training loss: 1.5216 2.5817 sec/batch
Epoch 7/20  Iteration 1072/3560 Training loss: 1.5179 2.6573 sec/batch
Epoch 7/20  Iteration 1073/3560 Training loss: 1.5105 2.7875 sec/batch
Epoch 7/20  Iteration 1074/3560 Training loss: 1.4989 2.9268 sec/batch
Epoch 7/20  Iteration 1075/3560 Training loss: 1.4983 2.5269 sec/batch
Epoch 7/20  Iteration 1076/3560 Training loss: 1.4966 2.7840 sec/batch
Epoch 7/20  Iteration 1077/3560 Training loss: 1.4961 2.5535 sec/batch
Epoch 7/20  Iteration 1078/3560 Training loss: 1.4945 2.5168 sec/batch
Epoch 7/20  Iteration 1079/3560 Training loss: 1.4907 2.4889 sec/batch
Epoch 7/20  Iteration 1080/3560 Training loss: 1.4895 2.4815 sec/batch
Epoch 7/20  Iteration 1081/3560 Training loss: 1.4894 2.5247 sec/batch
Epoch 7/20  Iteration 1082/3560 Training loss: 1.4910 2.5114 sec/batch
Epoch 7/20  Iteration 1083/3560 Training loss: 1.4903 2.4985 sec/batch
Epoch 7/20  Iteration 1084/3560 Training loss: 1.4888 2.4991 sec/batch
Epoch 7/20  Iteration 1085/3560 Training loss: 1.4891 2.4877 sec/batch
Epoch 7/20  Iteration 1086/3560 Training loss: 1.4905 2.4883 sec/batch
Epoch 7/20  Iteration 1087/3560 Training loss: 1.4901 2.5856 sec/batch
Epoch 7/20  Iteration 1088/3560 Training loss: 1.4912 2.5408 sec/batch
Epoch 7/20  Iteration 1089/3560 Training loss: 1.4901 2.4687 sec/batch
Epoch 7/20  Iteration 1090/3560 Training loss: 1.4900 2.5367 sec/batch
Epoch 7/20  Iteration 1091/3560 Training loss: 1.4890 2.8040 sec/batch
Epoch 7/20  Iteration 1092/3560 Training loss: 1.4887 2.5627 sec/batch
Epoch 7/20  Iteration 1093/3560 Training loss: 1.4887 2.5136 sec/batch
Epoch 7/20  Iteration 1094/3560 Training loss: 1.4871 2.4863 sec/batch
Epoch 7/20  Iteration 1095/3560 Training loss: 1.4856 2.4984 sec/batch
Epoch 7/20  Iteration 1096/3560 Training loss: 1.4861 2.4867 sec/batch
Epoch 7/20  Iteration 1097/3560 Training loss: 1.4861 2.5774 sec/batch
Epoch 7/20  Iteration 1098/3560 Training loss: 1.4862 2.5028 sec/batch
Epoch 7/20  Iteration 1099/3560 Training loss: 1.4859 2.5037 sec/batch
Epoch 7/20  Iteration 1100/3560 Training loss: 1.4850 2.4750 sec/batch
Epoch 7/20  Iteration 1101/3560 Training loss: 1.4852 2.5034 sec/batch
Epoch 7/20  Iteration 1102/3560 Training loss: 1.4854 2.5902 sec/batch
Epoch 7/20  Iteration 1103/3560 Training loss: 1.4851 2.5481 sec/batch
Epoch 7/20  Iteration 1104/3560 Training loss: 1.4848 2.4882 sec/batch
Epoch 7/20  Iteration 1105/3560 Training loss: 1.4839 2.4733 sec/batch
Epoch 7/20  Iteration 1106/3560 Training loss: 1.4825 2.7715 sec/batch
Epoch 7/20  Iteration 1107/3560 Training loss: 1.4810 2.5402 sec/batch
Epoch 7/20  Iteration 1108/3560 Training loss: 1.4804 2.5060 sec/batch
Epoch 7/20  Iteration 1109/3560 Training loss: 1.4799 2.4992 sec/batch
Epoch 7/20  Iteration 1110/3560 Training loss: 1.4808 2.5087 sec/batch
Epoch 7/20  Iteration 1111/3560 Training loss: 1.4804 2.5108 sec/batch
Epoch 7/20  Iteration 1112/3560 Training loss: 1.4798 2.5133 sec/batch
Epoch 7/20  Iteration 1113/3560 Training loss: 1.4801 2.4974 sec/batch
Epoch 7/20  Iteration 1114/3560 Training loss: 1.4793 2.5032 sec/batch
Epoch 7/20  Iteration 1115/3560 Training loss: 1.4788 2.5172 sec/batch
Epoch 7/20  Iteration 1116/3560 Training loss: 1.4784 2.4841 sec/batch
Epoch 7/20  Iteration 1117/3560 Training loss: 1.4782 2.6271 sec/batch
Epoch 7/20  Iteration 1118/3560 Training loss: 1.4784 2.5551 sec/batch
Epoch 7/20  Iteration 1119/3560 Training loss: 1.4780 2.4770 sec/batch
Epoch 7/20  Iteration 1120/3560 Training loss: 1.4787 2.4896 sec/batch
Epoch 7/20  Iteration 1121/3560 Training loss: 1.4786 2.9448 sec/batch
Epoch 7/20  Iteration 1122/3560 Training loss: 1.4785 2.6158 sec/batch
Epoch 7/20  Iteration 1123/3560 Training loss: 1.4781 2.4652 sec/batch
Epoch 7/20  Iteration 1124/3560 Training loss: 1.4782 2.4947 sec/batch
Epoch 7/20  Iteration 1125/3560 Training loss: 1.4785 2.5040 sec/batch
Epoch 7/20  Iteration 1126/3560 Training loss: 1.4780 2.5372 sec/batch
Epoch 7/20  Iteration 1127/3560 Training loss: 1.4774 2.5009 sec/batch
Epoch 7/20  Iteration 1128/3560 Training loss: 1.4779 2.5082 sec/batch
Epoch 7/20  Iteration 1129/3560 Training loss: 1.4776 2.7618 sec/batch
Epoch 7/20  Iteration 1130/3560 Training loss: 1.4784 2.4793 sec/batch
Epoch 7/20  Iteration 1131/3560 Training loss: 1.4785 2.4824 sec/batch
Epoch 7/20  Iteration 1132/3560 Training loss: 1.4786 2.5937 sec/batch
Epoch 7/20  Iteration 1133/3560 Training loss: 1.4783 2.5598 sec/batch
Epoch 7/20  Iteration 1134/3560 Training loss: 1.4783 2.4975 sec/batch
Epoch 7/20  Iteration 1135/3560 Training loss: 1.4784 2.6051 sec/batch
Epoch 7/20  Iteration 1136/3560 Training loss: 1.4780 2.6920 sec/batch
Epoch 7/20  Iteration 1137/3560 Training loss: 1.4780 2.5628 sec/batch
Epoch 7/20  Iteration 1138/3560 Training loss: 1.4777 2.5140 sec/batch
Epoch 7/20  Iteration 1139/3560 Training loss: 1.4783 2.4845 sec/batch
Epoch 7/20  Iteration 1140/3560 Training loss: 1.4782 2.4935 sec/batch
Epoch 7/20  Iteration 1141/3560 Training loss: 1.4787 2.5276 sec/batch
Epoch 7/20  Iteration 1142/3560 Training loss: 1.4783 2.4889 sec/batch
Epoch 7/20  Iteration 1143/3560 Training loss: 1.4781 2.4866 sec/batch
Epoch 7/20  Iteration 1144/3560 Training loss: 1.4781 2.5021 sec/batch
Epoch 7/20  Iteration 1145/3560 Training loss: 1.4778 2.4829 sec/batch
Epoch 7/20  Iteration 1146/3560 Training loss: 1.4776 2.4718 sec/batch
Epoch 7/20  Iteration 1147/3560 Training loss: 1.4768 2.5792 sec/batch
Epoch 7/20  Iteration 1148/3560 Training loss: 1.4766 2.5507 sec/batch
Epoch 7/20  Iteration 1149/3560 Training loss: 1.4760 2.5027 sec/batch
Epoch 7/20  Iteration 1150/3560 Training loss: 1.4759 2.5751 sec/batch
Epoch 7/20  Iteration 1151/3560 Training loss: 1.4753 2.6768 sec/batch
Epoch 7/20  Iteration 1152/3560 Training loss: 1.4751 2.5817 sec/batch
Epoch 7/20  Iteration 1153/3560 Training loss: 1.4747 2.4846 sec/batch
Epoch 7/20  Iteration 1154/3560 Training loss: 1.4743 2.4823 sec/batch
Epoch 7/20  Iteration 1155/3560 Training loss: 1.4739 2.4820 sec/batch
Epoch 7/20  Iteration 1156/3560 Training loss: 1.4735 2.5350 sec/batch
Epoch 7/20  Iteration 1157/3560 Training loss: 1.4729 2.5032 sec/batch
Epoch 7/20  Iteration 1158/3560 Training loss: 1.4729 2.5083 sec/batch
Epoch 7/20  Iteration 1159/3560 Training loss: 1.4723 2.4956 sec/batch
Epoch 7/20  Iteration 1160/3560 Training loss: 1.4720 2.4730 sec/batch
Epoch 7/20  Iteration 1161/3560 Training loss: 1.4714 2.4910 sec/batch
Epoch 7/20  Iteration 1162/3560 Training loss: 1.4710 2.6082 sec/batch
Epoch 7/20  Iteration 1163/3560 Training loss: 1.4705 2.5573 sec/batch
Epoch 7/20  Iteration 1164/3560 Training loss: 1.4704 2.4835 sec/batch
Epoch 7/20  Iteration 1165/3560 Training loss: 1.4701 2.5646 sec/batch
Epoch 7/20  Iteration 1166/3560 Training loss: 1.4695 2.7060 sec/batch
Epoch 7/20  Iteration 1167/3560 Training loss: 1.4692 2.5573 sec/batch
Epoch 7/20  Iteration 1168/3560 Training loss: 1.4686 2.6867 sec/batch
Epoch 7/20  Iteration 1169/3560 Training loss: 1.4684 2.5167 sec/batch
Epoch 7/20  Iteration 1170/3560 Training loss: 1.4681 2.5214 sec/batch
Epoch 7/20  Iteration 1171/3560 Training loss: 1.4679 2.5213 sec/batch
Epoch 7/20  Iteration 1172/3560 Training loss: 1.4675 2.4804 sec/batch
Epoch 7/20  Iteration 1173/3560 Training loss: 1.4672 2.4911 sec/batch
Epoch 7/20  Iteration 1174/3560 Training loss: 1.4669 2.5325 sec/batch
Epoch 7/20  Iteration 1175/3560 Training loss: 1.4667 2.4697 sec/batch
Epoch 7/20  Iteration 1176/3560 Training loss: 1.4665 2.4864 sec/batch
Epoch 7/20  Iteration 1177/3560 Training loss: 1.4662 2.5848 sec/batch
Epoch 7/20  Iteration 1178/3560 Training loss: 1.4660 2.5572 sec/batch
Epoch 7/20  Iteration 1179/3560 Training loss: 1.4656 2.4877 sec/batch
Epoch 7/20  Iteration 1180/3560 Training loss: 1.4654 2.5830 sec/batch
Epoch 7/20  Iteration 1181/3560 Training loss: 1.4651 2.7264 sec/batch
Epoch 7/20  Iteration 1182/3560 Training loss: 1.4647 2.5289 sec/batch
Epoch 7/20  Iteration 1183/3560 Training loss: 1.4644 2.5006 sec/batch
Epoch 7/20  Iteration 1184/3560 Training loss: 1.4639 2.4860 sec/batch
Epoch 7/20  Iteration 1185/3560 Training loss: 1.4637 2.5308 sec/batch
Epoch 7/20  Iteration 1186/3560 Training loss: 1.4636 2.5125 sec/batch
Epoch 7/20  Iteration 1187/3560 Training loss: 1.4632 2.4907 sec/batch
Epoch 7/20  Iteration 1188/3560 Training loss: 1.4630 2.5107 sec/batch
Epoch 7/20  Iteration 1189/3560 Training loss: 1.4628 2.5190 sec/batch
Epoch 7/20  Iteration 1190/3560 Training loss: 1.4623 2.4706 sec/batch
Epoch 7/20  Iteration 1191/3560 Training loss: 1.4617 2.4818 sec/batch
Epoch 7/20  Iteration 1192/3560 Training loss: 1.4617 2.5741 sec/batch
Epoch 7/20  Iteration 1193/3560 Training loss: 1.4615 2.5534 sec/batch
Epoch 7/20  Iteration 1194/3560 Training loss: 1.4609 2.4811 sec/batch
Epoch 7/20  Iteration 1195/3560 Training loss: 1.4608 2.5851 sec/batch
Epoch 7/20  Iteration 1196/3560 Training loss: 1.4608 2.7077 sec/batch
Epoch 7/20  Iteration 1197/3560 Training loss: 1.4605 2.5627 sec/batch
Epoch 7/20  Iteration 1198/3560 Training loss: 1.4601 2.4942 sec/batch
Epoch 7/20  Iteration 1199/3560 Training loss: 1.4595 2.4722 sec/batch
Epoch 7/20  Iteration 1200/3560 Training loss: 1.4591 2.5149 sec/batch
Epoch 7/20  Iteration 1201/3560 Training loss: 1.4590 2.5002 sec/batch
Epoch 7/20  Iteration 1202/3560 Training loss: 1.4589 2.4876 sec/batch
Epoch 7/20  Iteration 1203/3560 Training loss: 1.4587 2.5031 sec/batch
Epoch 7/20  Iteration 1204/3560 Training loss: 1.4586 2.5138 sec/batch
Epoch 7/20  Iteration 1205/3560 Training loss: 1.4586 2.4740 sec/batch
Epoch 7/20  Iteration 1206/3560 Training loss: 1.4584 2.4780 sec/batch
Epoch 7/20  Iteration 1207/3560 Training loss: 1.4584 2.5789 sec/batch
Epoch 7/20  Iteration 1208/3560 Training loss: 1.4582 2.5528 sec/batch
Epoch 7/20  Iteration 1209/3560 Training loss: 1.4583 2.5090 sec/batch
Epoch 7/20  Iteration 1210/3560 Training loss: 1.4581 2.5724 sec/batch
Epoch 7/20  Iteration 1211/3560 Training loss: 1.4579 2.7341 sec/batch
Epoch 7/20  Iteration 1212/3560 Training loss: 1.4580 2.5089 sec/batch
Epoch 7/20  Iteration 1213/3560 Training loss: 1.4577 2.5053 sec/batch
Epoch 7/20  Iteration 1214/3560 Training loss: 1.4577 2.4780 sec/batch
Epoch 7/20  Iteration 1215/3560 Training loss: 1.4577 2.5709 sec/batch
Epoch 7/20  Iteration 1216/3560 Training loss: 1.4578 2.6217 sec/batch
Epoch 7/20  Iteration 1217/3560 Training loss: 1.4577 2.4958 sec/batch
Epoch 7/20  Iteration 1218/3560 Training loss: 1.4575 2.5132 sec/batch
Epoch 7/20  Iteration 1219/3560 Training loss: 1.4571 2.5075 sec/batch
Epoch 7/20  Iteration 1220/3560 Training loss: 1.4569 2.4714 sec/batch
Epoch 7/20  Iteration 1221/3560 Training loss: 1.4568 2.4963 sec/batch
Epoch 7/20  Iteration 1222/3560 Training loss: 1.4567 2.5825 sec/batch
Epoch 7/20  Iteration 1223/3560 Training loss: 1.4565 2.5648 sec/batch
Epoch 7/20  Iteration 1224/3560 Training loss: 1.4564 2.4924 sec/batch
Epoch 7/20  Iteration 1225/3560 Training loss: 1.4564 2.5837 sec/batch
Epoch 7/20  Iteration 1226/3560 Training loss: 1.4562 2.7428 sec/batch
Epoch 7/20  Iteration 1227/3560 Training loss: 1.4558 2.4921 sec/batch
Epoch 7/20  Iteration 1228/3560 Training loss: 1.4558 2.4876 sec/batch
Epoch 7/20  Iteration 1229/3560 Training loss: 1.4558 2.4795 sec/batch
Epoch 7/20  Iteration 1230/3560 Training loss: 1.4557 2.5236 sec/batch
Epoch 7/20  Iteration 1231/3560 Training loss: 1.4556 2.4949 sec/batch
Epoch 7/20  Iteration 1232/3560 Training loss: 1.4554 2.4890 sec/batch
Epoch 7/20  Iteration 1233/3560 Training loss: 1.4553 2.5152 sec/batch
Epoch 7/20  Iteration 1234/3560 Training loss: 1.4551 2.5132 sec/batch
Epoch 7/20  Iteration 1235/3560 Training loss: 1.4551 2.4687 sec/batch
Epoch 7/20  Iteration 1236/3560 Training loss: 1.4554 2.4863 sec/batch
Epoch 7/20  Iteration 1237/3560 Training loss: 1.4553 2.5915 sec/batch
Epoch 7/20  Iteration 1238/3560 Training loss: 1.4552 2.5414 sec/batch
Epoch 7/20  Iteration 1239/3560 Training loss: 1.4550 2.5015 sec/batch
Epoch 7/20  Iteration 1240/3560 Training loss: 1.4548 2.5745 sec/batch
Epoch 7/20  Iteration 1241/3560 Training loss: 1.4548 2.7695 sec/batch
Epoch 7/20  Iteration 1242/3560 Training loss: 1.4546 2.4748 sec/batch
Epoch 7/20  Iteration 1243/3560 Training loss: 1.4546 2.4860 sec/batch
Epoch 7/20  Iteration 1244/3560 Training loss: 1.4543 2.4819 sec/batch
Epoch 7/20  Iteration 1245/3560 Training loss: 1.4541 2.5184 sec/batch
Epoch 7/20  Iteration 1246/3560 Training loss: 1.4540 2.5021 sec/batch
Epoch 8/20  Iteration 1247/3560 Training loss: 1.5578 2.9804 sec/batch
Epoch 8/20  Iteration 1248/3560 Training loss: 1.5030 2.8019 sec/batch
Epoch 8/20  Iteration 1249/3560 Training loss: 1.4843 2.5273 sec/batch
Epoch 8/20  Iteration 1250/3560 Training loss: 1.4759 2.4785 sec/batch
Epoch 8/20  Iteration 1251/3560 Training loss: 1.4637 2.5260 sec/batch
Epoch 8/20  Iteration 1252/3560 Training loss: 1.4512 2.6367 sec/batch
Epoch 8/20  Iteration 1253/3560 Training loss: 1.4486 2.4810 sec/batch
Epoch 8/20  Iteration 1254/3560 Training loss: 1.4457 2.4945 sec/batch
Epoch 8/20  Iteration 1255/3560 Training loss: 1.4448 2.7407 sec/batch
Epoch 8/20  Iteration 1256/3560 Training loss: 1.4430 2.6534 sec/batch
Epoch 8/20  Iteration 1257/3560 Training loss: 1.4387 2.5075 sec/batch
Epoch 8/20  Iteration 1258/3560 Training loss: 1.4374 2.5003 sec/batch
Epoch 8/20  Iteration 1259/3560 Training loss: 1.4371 2.5097 sec/batch
Epoch 8/20  Iteration 1260/3560 Training loss: 1.4382 2.5296 sec/batch
Epoch 8/20  Iteration 1261/3560 Training loss: 1.4368 2.5023 sec/batch
Epoch 8/20  Iteration 1262/3560 Training loss: 1.4344 2.5448 sec/batch
Epoch 8/20  Iteration 1263/3560 Training loss: 1.4335 2.6353 sec/batch
Epoch 8/20  Iteration 1264/3560 Training loss: 1.4341 2.5052 sec/batch
Epoch 8/20  Iteration 1265/3560 Training loss: 1.4341 2.4643 sec/batch
Epoch 8/20  Iteration 1266/3560 Training loss: 1.4348 2.6065 sec/batch
Epoch 8/20  Iteration 1267/3560 Training loss: 1.4330 2.5542 sec/batch
Epoch 8/20  Iteration 1268/3560 Training loss: 1.4329 2.4889 sec/batch
Epoch 8/20  Iteration 1269/3560 Training loss: 1.4315 2.4855 sec/batch
Epoch 8/20  Iteration 1270/3560 Training loss: 1.4309 2.7278 sec/batch
Epoch 8/20  Iteration 1271/3560 Training loss: 1.4309 2.6178 sec/batch
Epoch 8/20  Iteration 1272/3560 Training loss: 1.4284 2.4768 sec/batch
Epoch 8/20  Iteration 1273/3560 Training loss: 1.4269 2.5221 sec/batch
Epoch 8/20  Iteration 1274/3560 Training loss: 1.4269 2.4859 sec/batch
Epoch 8/20  Iteration 1275/3560 Training loss: 1.4269 2.5165 sec/batch
Epoch 8/20  Iteration 1276/3560 Training loss: 1.4271 2.4939 sec/batch
Epoch 8/20  Iteration 1277/3560 Training loss: 1.4264 2.4921 sec/batch
Epoch 8/20  Iteration 1278/3560 Training loss: 1.4250 2.5102 sec/batch
Epoch 8/20  Iteration 1279/3560 Training loss: 1.4251 2.4890 sec/batch
Epoch 8/20  Iteration 1280/3560 Training loss: 1.4253 2.4853 sec/batch
Epoch 8/20  Iteration 1281/3560 Training loss: 1.4248 2.5703 sec/batch
Epoch 8/20  Iteration 1282/3560 Training loss: 1.4243 2.5929 sec/batch
Epoch 8/20  Iteration 1283/3560 Training loss: 1.4234 2.4770 sec/batch
Epoch 8/20  Iteration 1284/3560 Training loss: 1.4220 2.5038 sec/batch
Epoch 8/20  Iteration 1285/3560 Training loss: 1.4202 2.7480 sec/batch
Epoch 8/20  Iteration 1286/3560 Training loss: 1.4197 2.6060 sec/batch
Epoch 8/20  Iteration 1287/3560 Training loss: 1.4188 2.4771 sec/batch
Epoch 8/20  Iteration 1288/3560 Training loss: 1.4194 2.5110 sec/batch
Epoch 8/20  Iteration 1289/3560 Training loss: 1.4189 2.4858 sec/batch
Epoch 8/20  Iteration 1290/3560 Training loss: 1.4181 2.5222 sec/batch
Epoch 8/20  Iteration 1291/3560 Training loss: 1.4181 2.5058 sec/batch
Epoch 8/20  Iteration 1292/3560 Training loss: 1.4171 2.4922 sec/batch
Epoch 8/20  Iteration 1293/3560 Training loss: 1.4164 2.5297 sec/batch
Epoch 8/20  Iteration 1294/3560 Training loss: 1.4159 2.4749 sec/batch
Epoch 8/20  Iteration 1295/3560 Training loss: 1.4156 2.4711 sec/batch
Epoch 8/20  Iteration 1296/3560 Training loss: 1.4158 2.6048 sec/batch
Epoch 8/20  Iteration 1297/3560 Training loss: 1.4154 2.5651 sec/batch
Epoch 8/20  Iteration 1298/3560 Training loss: 1.4159 2.4714 sec/batch
Epoch 8/20  Iteration 1299/3560 Training loss: 1.4156 2.4877 sec/batch
Epoch 8/20  Iteration 1300/3560 Training loss: 1.4157 2.7548 sec/batch
Epoch 8/20  Iteration 1301/3560 Training loss: 1.4154 2.5711 sec/batch
Epoch 8/20  Iteration 1302/3560 Training loss: 1.4153 2.4876 sec/batch
Epoch 8/20  Iteration 1303/3560 Training loss: 1.4155 2.5186 sec/batch
Epoch 8/20  Iteration 1304/3560 Training loss: 1.4151 2.4855 sec/batch
Epoch 8/20  Iteration 1305/3560 Training loss: 1.4144 2.5119 sec/batch
Epoch 8/20  Iteration 1306/3560 Training loss: 1.4147 2.5107 sec/batch
Epoch 8/20  Iteration 1307/3560 Training loss: 1.4144 2.5086 sec/batch
Epoch 8/20  Iteration 1308/3560 Training loss: 1.4151 2.5119 sec/batch
Epoch 8/20  Iteration 1309/3560 Training loss: 1.4152 2.6283 sec/batch
Epoch 8/20  Iteration 1310/3560 Training loss: 1.4152 2.5451 sec/batch
Epoch 8/20  Iteration 1311/3560 Training loss: 1.4149 2.6028 sec/batch
Epoch 8/20  Iteration 1312/3560 Training loss: 1.4149 2.5499 sec/batch
Epoch 8/20  Iteration 1313/3560 Training loss: 1.4150 2.4686 sec/batch
Epoch 8/20  Iteration 1314/3560 Training loss: 1.4146 2.5068 sec/batch
Epoch 8/20  Iteration 1315/3560 Training loss: 1.4145 2.7911 sec/batch
Epoch 8/20  Iteration 1316/3560 Training loss: 1.4143 2.5570 sec/batch
Epoch 8/20  Iteration 1317/3560 Training loss: 1.4149 2.4844 sec/batch
Epoch 8/20  Iteration 1318/3560 Training loss: 1.4149 2.5038 sec/batch
Epoch 8/20  Iteration 1319/3560 Training loss: 1.4152 2.4838 sec/batch
Epoch 8/20  Iteration 1320/3560 Training loss: 1.4147 2.5157 sec/batch
Epoch 8/20  Iteration 1321/3560 Training loss: 1.4146 2.5215 sec/batch
Epoch 8/20  Iteration 1322/3560 Training loss: 1.4147 2.4911 sec/batch
Epoch 8/20  Iteration 1323/3560 Training loss: 1.4144 2.5329 sec/batch
Epoch 8/20  Iteration 1324/3560 Training loss: 1.4141 2.4841 sec/batch
Epoch 8/20  Iteration 1325/3560 Training loss: 1.4134 2.4742 sec/batch
Epoch 8/20  Iteration 1326/3560 Training loss: 1.4133 2.5996 sec/batch
Epoch 8/20  Iteration 1327/3560 Training loss: 1.4128 2.5753 sec/batch
Epoch 8/20  Iteration 1328/3560 Training loss: 1.4126 2.4729 sec/batch
Epoch 8/20  Iteration 1329/3560 Training loss: 1.4121 2.4863 sec/batch
Epoch 8/20  Iteration 1330/3560 Training loss: 1.4119 2.7976 sec/batch
Epoch 8/20  Iteration 1331/3560 Training loss: 1.4115 2.5491 sec/batch
Epoch 8/20  Iteration 1332/3560 Training loss: 1.4112 2.5250 sec/batch
Epoch 8/20  Iteration 1333/3560 Training loss: 1.4108 2.4752 sec/batch
Epoch 8/20  Iteration 1334/3560 Training loss: 1.4104 2.4941 sec/batch
Epoch 8/20  Iteration 1335/3560 Training loss: 1.4099 2.5070 sec/batch
Epoch 8/20  Iteration 1336/3560 Training loss: 1.4099 2.5165 sec/batch
Epoch 8/20  Iteration 1337/3560 Training loss: 1.4095 2.4863 sec/batch
Epoch 8/20  Iteration 1338/3560 Training loss: 1.4093 2.5381 sec/batch
Epoch 8/20  Iteration 1339/3560 Training loss: 1.4087 2.5101 sec/batch
Epoch 8/20  Iteration 1340/3560 Training loss: 1.4083 2.4694 sec/batch
Epoch 8/20  Iteration 1341/3560 Training loss: 1.4078 2.6025 sec/batch
Epoch 8/20  Iteration 1342/3560 Training loss: 1.4078 2.5464 sec/batch
Epoch 8/20  Iteration 1343/3560 Training loss: 1.4076 2.4848 sec/batch
Epoch 8/20  Iteration 1344/3560 Training loss: 1.4071 2.4918 sec/batch
Epoch 8/20  Iteration 1345/3560 Training loss: 1.4068 2.7797 sec/batch
Epoch 8/20  Iteration 1346/3560 Training loss: 1.4063 2.5562 sec/batch
Epoch 8/20  Iteration 1347/3560 Training loss: 1.4061 2.5097 sec/batch
Epoch 8/20  Iteration 1348/3560 Training loss: 1.4058 2.4692 sec/batch
Epoch 8/20  Iteration 1349/3560 Training loss: 1.4056 2.4921 sec/batch
Epoch 8/20  Iteration 1350/3560 Training loss: 1.4054 2.5212 sec/batch
Epoch 8/20  Iteration 1351/3560 Training loss: 1.4051 2.5236 sec/batch
Epoch 8/20  Iteration 1352/3560 Training loss: 1.4048 2.4893 sec/batch
Epoch 8/20  Iteration 1353/3560 Training loss: 1.4048 2.5346 sec/batch
Epoch 8/20  Iteration 1354/3560 Training loss: 1.4046 2.4915 sec/batch
Epoch 8/20  Iteration 1355/3560 Training loss: 1.4044 2.4829 sec/batch
Epoch 8/20  Iteration 1356/3560 Training loss: 1.4044 2.7730 sec/batch
Epoch 8/20  Iteration 1357/3560 Training loss: 1.4040 2.5840 sec/batch
Epoch 8/20  Iteration 1358/3560 Training loss: 1.4039 2.4809 sec/batch
Epoch 8/20  Iteration 1359/3560 Training loss: 1.4036 2.4853 sec/batch
Epoch 8/20  Iteration 1360/3560 Training loss: 1.4033 2.7644 sec/batch
Epoch 8/20  Iteration 1361/3560 Training loss: 1.4030 2.5598 sec/batch
Epoch 8/20  Iteration 1362/3560 Training loss: 1.4025 2.5181 sec/batch
Epoch 8/20  Iteration 1363/3560 Training loss: 1.4024 2.4876 sec/batch
Epoch 8/20  Iteration 1364/3560 Training loss: 1.4022 2.7267 sec/batch
Epoch 8/20  Iteration 1365/3560 Training loss: 1.4020 2.5494 sec/batch
Epoch 8/20  Iteration 1366/3560 Training loss: 1.4018 2.4986 sec/batch
Epoch 8/20  Iteration 1367/3560 Training loss: 1.4016 2.4961 sec/batch
Epoch 8/20  Iteration 1368/3560 Training loss: 1.4012 2.5032 sec/batch
Epoch 8/20  Iteration 1369/3560 Training loss: 1.4007 2.4837 sec/batch
Epoch 8/20  Iteration 1370/3560 Training loss: 1.4006 2.4929 sec/batch
Epoch 8/20  Iteration 1371/3560 Training loss: 1.4004 2.5971 sec/batch
Epoch 8/20  Iteration 1372/3560 Training loss: 1.4000 2.5577 sec/batch
Epoch 8/20  Iteration 1373/3560 Training loss: 1.3999 2.4912 sec/batch
Epoch 8/20  Iteration 1374/3560 Training loss: 1.3998 2.5554 sec/batch
Epoch 8/20  Iteration 1375/3560 Training loss: 1.3995 2.7269 sec/batch
Epoch 8/20  Iteration 1376/3560 Training loss: 1.3991 2.5372 sec/batch
Epoch 8/20  Iteration 1377/3560 Training loss: 1.3986 2.5008 sec/batch
Epoch 8/20  Iteration 1378/3560 Training loss: 1.3983 2.4936 sec/batch
Epoch 8/20  Iteration 1379/3560 Training loss: 1.3983 2.4963 sec/batch
Epoch 8/20  Iteration 1380/3560 Training loss: 1.3983 2.5194 sec/batch
Epoch 8/20  Iteration 1381/3560 Training loss: 1.3981 2.4724 sec/batch
Epoch 8/20  Iteration 1382/3560 Training loss: 1.3981 2.5024 sec/batch
Epoch 8/20  Iteration 1383/3560 Training loss: 1.3981 2.4898 sec/batch
Epoch 8/20  Iteration 1384/3560 Training loss: 1.3981 2.4819 sec/batch
Epoch 8/20  Iteration 1385/3560 Training loss: 1.3981 2.4717 sec/batch
Epoch 8/20  Iteration 1386/3560 Training loss: 1.3979 2.6015 sec/batch
Epoch 8/20  Iteration 1387/3560 Training loss: 1.3981 2.5373 sec/batch
Epoch 8/20  Iteration 1388/3560 Training loss: 1.3980 2.4930 sec/batch
Epoch 8/20  Iteration 1389/3560 Training loss: 1.3979 2.5293 sec/batch
Epoch 8/20  Iteration 1390/3560 Training loss: 1.3980 2.7257 sec/batch
Epoch 8/20  Iteration 1391/3560 Training loss: 1.3978 2.5809 sec/batch
Epoch 8/20  Iteration 1392/3560 Training loss: 1.3979 2.4870 sec/batch
Epoch 8/20  Iteration 1393/3560 Training loss: 1.3978 2.4851 sec/batch
Epoch 8/20  Iteration 1394/3560 Training loss: 1.3980 2.4755 sec/batch
Epoch 8/20  Iteration 1395/3560 Training loss: 1.3981 2.5341 sec/batch
Epoch 8/20  Iteration 1396/3560 Training loss: 1.3978 2.4880 sec/batch
Epoch 8/20  Iteration 1397/3560 Training loss: 1.3974 2.5116 sec/batch
Epoch 8/20  Iteration 1398/3560 Training loss: 1.3973 2.5163 sec/batch
Epoch 8/20  Iteration 1399/3560 Training loss: 1.3974 2.4866 sec/batch
Epoch 8/20  Iteration 1400/3560 Training loss: 1.3973 2.5012 sec/batch
Epoch 8/20  Iteration 1401/3560 Training loss: 1.3972 2.5970 sec/batch
Epoch 8/20  Iteration 1402/3560 Training loss: 1.3971 2.5501 sec/batch
Epoch 8/20  Iteration 1403/3560 Training loss: 1.3971 2.4988 sec/batch
Epoch 8/20  Iteration 1404/3560 Training loss: 1.3970 2.7366 sec/batch
Epoch 8/20  Iteration 1405/3560 Training loss: 1.3967 2.7557 sec/batch
Epoch 8/20  Iteration 1406/3560 Training loss: 1.3967 2.5438 sec/batch
Epoch 8/20  Iteration 1407/3560 Training loss: 1.3967 2.4825 sec/batch
Epoch 8/20  Iteration 1408/3560 Training loss: 1.3967 2.4833 sec/batch
Epoch 8/20  Iteration 1409/3560 Training loss: 1.3966 2.4869 sec/batch
Epoch 8/20  Iteration 1410/3560 Training loss: 1.3965 2.5400 sec/batch
Epoch 8/20  Iteration 1411/3560 Training loss: 1.3965 2.4799 sec/batch
Epoch 8/20  Iteration 1412/3560 Training loss: 1.3963 2.5088 sec/batch
Epoch 8/20  Iteration 1413/3560 Training loss: 1.3964 2.4913 sec/batch
Epoch 8/20  Iteration 1414/3560 Training loss: 1.3968 2.4853 sec/batch
Epoch 8/20  Iteration 1415/3560 Training loss: 1.3967 2.4752 sec/batch
Epoch 8/20  Iteration 1416/3560 Training loss: 1.3966 2.5784 sec/batch
Epoch 8/20  Iteration 1417/3560 Training loss: 1.3965 2.5761 sec/batch
Epoch 8/20  Iteration 1418/3560 Training loss: 1.3963 2.4826 sec/batch
Epoch 8/20  Iteration 1419/3560 Training loss: 1.3964 2.5964 sec/batch
Epoch 8/20  Iteration 1420/3560 Training loss: 1.3963 2.7087 sec/batch
Epoch 8/20  Iteration 1421/3560 Training loss: 1.3963 2.5392 sec/batch
Epoch 8/20  Iteration 1422/3560 Training loss: 1.3960 2.4909 sec/batch
Epoch 8/20  Iteration 1423/3560 Training loss: 1.3958 2.4824 sec/batch
Epoch 8/20  Iteration 1424/3560 Training loss: 1.3959 2.4902 sec/batch
Epoch 9/20  Iteration 1425/3560 Training loss: 1.5132 2.5278 sec/batch
Epoch 9/20  Iteration 1426/3560 Training loss: 1.4583 2.4949 sec/batch
Epoch 9/20  Iteration 1427/3560 Training loss: 1.4393 2.4912 sec/batch
Epoch 9/20  Iteration 1428/3560 Training loss: 1.4327 2.4995 sec/batch
Epoch 9/20  Iteration 1429/3560 Training loss: 1.4229 2.4937 sec/batch
Epoch 9/20  Iteration 1430/3560 Training loss: 1.4129 2.4666 sec/batch
Epoch 9/20  Iteration 1431/3560 Training loss: 1.4121 2.5789 sec/batch
Epoch 9/20  Iteration 1432/3560 Training loss: 1.4091 2.5688 sec/batch
Epoch 9/20  Iteration 1433/3560 Training loss: 1.4083 2.5078 sec/batch
Epoch 9/20  Iteration 1434/3560 Training loss: 1.4049 2.5597 sec/batch
Epoch 9/20  Iteration 1435/3560 Training loss: 1.4000 2.7197 sec/batch
Epoch 9/20  Iteration 1436/3560 Training loss: 1.3983 2.5225 sec/batch
Epoch 9/20  Iteration 1437/3560 Training loss: 1.3978 2.4798 sec/batch
Epoch 9/20  Iteration 1438/3560 Training loss: 1.3986 2.4930 sec/batch
Epoch 9/20  Iteration 1439/3560 Training loss: 1.3964 2.4773 sec/batch
Epoch 9/20  Iteration 1440/3560 Training loss: 1.3942 2.5451 sec/batch
Epoch 9/20  Iteration 1441/3560 Training loss: 1.3937 2.4916 sec/batch
Epoch 9/20  Iteration 1442/3560 Training loss: 1.3944 2.5099 sec/batch
Epoch 9/20  Iteration 1443/3560 Training loss: 1.3946 2.4945 sec/batch
Epoch 9/20  Iteration 1444/3560 Training loss: 1.3954 2.4760 sec/batch
Epoch 9/20  Iteration 1445/3560 Training loss: 1.3939 2.5209 sec/batch
Epoch 9/20  Iteration 1446/3560 Training loss: 1.3936 2.5765 sec/batch
Epoch 9/20  Iteration 1447/3560 Training loss: 1.3921 2.5648 sec/batch
Epoch 9/20  Iteration 1448/3560 Training loss: 1.3919 2.4908 sec/batch
Epoch 9/20  Iteration 1449/3560 Training loss: 1.3913 2.6114 sec/batch
Epoch 9/20  Iteration 1450/3560 Training loss: 1.3890 2.7070 sec/batch
Epoch 9/20  Iteration 1451/3560 Training loss: 1.3872 2.6951 sec/batch
Epoch 9/20  Iteration 1452/3560 Training loss: 1.3873 2.5306 sec/batch
Epoch 9/20  Iteration 1453/3560 Training loss: 1.3872 2.4711 sec/batch
Epoch 9/20  Iteration 1454/3560 Training loss: 1.3870 2.5127 sec/batch
Epoch 9/20  Iteration 1455/3560 Training loss: 1.3863 2.5336 sec/batch
Epoch 9/20  Iteration 1456/3560 Training loss: 1.3845 2.4931 sec/batch
Epoch 9/20  Iteration 1457/3560 Training loss: 1.3845 2.5134 sec/batch
Epoch 9/20  Iteration 1458/3560 Training loss: 1.3844 2.5216 sec/batch
Epoch 9/20  Iteration 1459/3560 Training loss: 1.3840 2.4778 sec/batch
Epoch 9/20  Iteration 1460/3560 Training loss: 1.3835 2.4629 sec/batch
Epoch 9/20  Iteration 1461/3560 Training loss: 1.3826 2.5847 sec/batch
Epoch 9/20  Iteration 1462/3560 Training loss: 1.3810 2.5532 sec/batch
Epoch 9/20  Iteration 1463/3560 Training loss: 1.3792 2.5287 sec/batch
Epoch 9/20  Iteration 1464/3560 Training loss: 1.3785 2.6063 sec/batch
Epoch 9/20  Iteration 1465/3560 Training loss: 1.3776 2.7655 sec/batch
Epoch 9/20  Iteration 1466/3560 Training loss: 1.3781 2.4833 sec/batch
Epoch 9/20  Iteration 1467/3560 Training loss: 1.3776 2.4812 sec/batch
Epoch 9/20  Iteration 1468/3560 Training loss: 1.3767 2.4825 sec/batch
Epoch 9/20  Iteration 1469/3560 Training loss: 1.3768 2.5292 sec/batch
Epoch 9/20  Iteration 1470/3560 Training loss: 1.3759 2.5276 sec/batch
Epoch 9/20  Iteration 1471/3560 Training loss: 1.3753 2.4797 sec/batch
Epoch 9/20  Iteration 1472/3560 Training loss: 1.3746 2.5085 sec/batch
Epoch 9/20  Iteration 1473/3560 Training loss: 1.3746 2.5144 sec/batch
Epoch 9/20  Iteration 1474/3560 Training loss: 1.3746 2.4756 sec/batch
Epoch 9/20  Iteration 1475/3560 Training loss: 1.3742 2.4888 sec/batch
Epoch 9/20  Iteration 1476/3560 Training loss: 1.3747 2.5701 sec/batch
Epoch 9/20  Iteration 1477/3560 Training loss: 1.3743 2.5622 sec/batch
Epoch 9/20  Iteration 1478/3560 Training loss: 1.3743 2.4829 sec/batch
Epoch 9/20  Iteration 1479/3560 Training loss: 1.3738 2.5793 sec/batch
Epoch 9/20  Iteration 1480/3560 Training loss: 1.3737 2.7840 sec/batch
Epoch 9/20  Iteration 1481/3560 Training loss: 1.3738 2.4758 sec/batch
Epoch 9/20  Iteration 1482/3560 Training loss: 1.3734 2.6877 sec/batch
Epoch 9/20  Iteration 1483/3560 Training loss: 1.3726 2.6179 sec/batch
Epoch 9/20  Iteration 1484/3560 Training loss: 1.3730 2.5255 sec/batch
Epoch 9/20  Iteration 1485/3560 Training loss: 1.3727 2.4904 sec/batch
Epoch 9/20  Iteration 1486/3560 Training loss: 1.3734 2.4658 sec/batch
Epoch 9/20  Iteration 1487/3560 Training loss: 1.3735 2.5176 sec/batch
Epoch 9/20  Iteration 1488/3560 Training loss: 1.3735 2.5130 sec/batch
Epoch 9/20  Iteration 1489/3560 Training loss: 1.3732 2.4858 sec/batch
Epoch 9/20  Iteration 1490/3560 Training loss: 1.3732 2.4883 sec/batch
Epoch 9/20  Iteration 1491/3560 Training loss: 1.3734 2.6218 sec/batch
Epoch 9/20  Iteration 1492/3560 Training loss: 1.3731 2.5305 sec/batch
Epoch 9/20  Iteration 1493/3560 Training loss: 1.3729 2.4884 sec/batch
Epoch 9/20  Iteration 1494/3560 Training loss: 1.3727 2.5887 sec/batch
Epoch 9/20  Iteration 1495/3560 Training loss: 1.3732 2.7860 sec/batch
Epoch 9/20  Iteration 1496/3560 Training loss: 1.3733 2.5024 sec/batch
Epoch 9/20  Iteration 1497/3560 Training loss: 1.3736 2.4972 sec/batch
Epoch 9/20  Iteration 1498/3560 Training loss: 1.3731 2.6949 sec/batch
Epoch 9/20  Iteration 1499/3560 Training loss: 1.3729 2.5608 sec/batch
Epoch 9/20  Iteration 1500/3560 Training loss: 1.3730 2.5165 sec/batch
Epoch 9/20  Iteration 1501/3560 Training loss: 1.3727 2.4636 sec/batch
Epoch 9/20  Iteration 1502/3560 Training loss: 1.3726 2.5122 sec/batch
Epoch 9/20  Iteration 1503/3560 Training loss: 1.3718 2.5055 sec/batch
Epoch 9/20  Iteration 1504/3560 Training loss: 1.3716 2.4803 sec/batch
Epoch 9/20  Iteration 1505/3560 Training loss: 1.3711 2.4712 sec/batch
Epoch 9/20  Iteration 1506/3560 Training loss: 1.3708 2.7094 sec/batch
Epoch 9/20  Iteration 1507/3560 Training loss: 1.3702 2.4834 sec/batch
Epoch 9/20  Iteration 1508/3560 Training loss: 1.3700 2.4858 sec/batch
Epoch 9/20  Iteration 1509/3560 Training loss: 1.3697 2.6650 sec/batch
Epoch 9/20  Iteration 1510/3560 Training loss: 1.3695 2.6641 sec/batch
Epoch 9/20  Iteration 1511/3560 Training loss: 1.3691 2.4918 sec/batch
Epoch 9/20  Iteration 1512/3560 Training loss: 1.3687 2.4738 sec/batch
Epoch 9/20  Iteration 1513/3560 Training loss: 1.3682 2.5023 sec/batch
Epoch 9/20  Iteration 1514/3560 Training loss: 1.3681 2.5163 sec/batch
Epoch 9/20  Iteration 1515/3560 Training loss: 1.3678 2.5141 sec/batch
Epoch 9/20  Iteration 1516/3560 Training loss: 1.3675 2.5020 sec/batch
Epoch 9/20  Iteration 1517/3560 Training loss: 1.3669 2.4995 sec/batch
Epoch 9/20  Iteration 1518/3560 Training loss: 1.3665 2.5356 sec/batch
Epoch 9/20  Iteration 1519/3560 Training loss: 1.3661 2.4899 sec/batch
Epoch 9/20  Iteration 1520/3560 Training loss: 1.3661 2.4682 sec/batch
Epoch 9/20  Iteration 1521/3560 Training loss: 1.3660 2.6648 sec/batch
Epoch 9/20  Iteration 1522/3560 Training loss: 1.3655 2.4842 sec/batch
Epoch 9/20  Iteration 1523/3560 Training loss: 1.3652 2.4896 sec/batch
Epoch 9/20  Iteration 1524/3560 Training loss: 1.3646 2.6441 sec/batch
Epoch 9/20  Iteration 1525/3560 Training loss: 1.3645 2.6798 sec/batch
Epoch 9/20  Iteration 1526/3560 Training loss: 1.3642 2.4893 sec/batch
Epoch 9/20  Iteration 1527/3560 Training loss: 1.3640 2.5002 sec/batch
Epoch 9/20  Iteration 1528/3560 Training loss: 1.3637 2.5108 sec/batch
Epoch 9/20  Iteration 1529/3560 Training loss: 1.3634 2.4982 sec/batch
Epoch 9/20  Iteration 1530/3560 Training loss: 1.3632 2.5129 sec/batch
Epoch 9/20  Iteration 1531/3560 Training loss: 1.3630 2.4782 sec/batch
Epoch 9/20  Iteration 1532/3560 Training loss: 1.3629 2.4928 sec/batch
Epoch 9/20  Iteration 1533/3560 Training loss: 1.3626 2.5089 sec/batch
Epoch 9/20  Iteration 1534/3560 Training loss: 1.3625 2.4652 sec/batch
Epoch 9/20  Iteration 1535/3560 Training loss: 1.3623 2.4737 sec/batch
Epoch 9/20  Iteration 1536/3560 Training loss: 1.3621 2.6840 sec/batch
Epoch 9/20  Iteration 1537/3560 Training loss: 1.3618 2.4702 sec/batch
Epoch 9/20  Iteration 1538/3560 Training loss: 1.3615 2.4952 sec/batch
Epoch 9/20  Iteration 1539/3560 Training loss: 1.3612 2.6854 sec/batch
Epoch 9/20  Iteration 1540/3560 Training loss: 1.3607 2.6584 sec/batch
Epoch 9/20  Iteration 1541/3560 Training loss: 1.3606 2.4865 sec/batch
Epoch 9/20  Iteration 1542/3560 Training loss: 1.3605 2.4933 sec/batch
Epoch 9/20  Iteration 1543/3560 Training loss: 1.3603 2.4992 sec/batch
Epoch 9/20  Iteration 1544/3560 Training loss: 1.3601 2.4962 sec/batch
Epoch 9/20  Iteration 1545/3560 Training loss: 1.3599 2.6424 sec/batch
Epoch 9/20  Iteration 1546/3560 Training loss: 1.3594 2.5428 sec/batch
Epoch 9/20  Iteration 1547/3560 Training loss: 1.3589 2.5393 sec/batch
Epoch 9/20  Iteration 1548/3560 Training loss: 1.3588 2.5051 sec/batch
Epoch 9/20  Iteration 1549/3560 Training loss: 1.3586 2.4585 sec/batch
Epoch 9/20  Iteration 1550/3560 Training loss: 1.3581 2.4775 sec/batch
Epoch 9/20  Iteration 1551/3560 Training loss: 1.3580 2.6905 sec/batch
Epoch 9/20  Iteration 1552/3560 Training loss: 1.3579 2.4761 sec/batch
Epoch 9/20  Iteration 1553/3560 Training loss: 1.3576 2.4984 sec/batch
Epoch 9/20  Iteration 1554/3560 Training loss: 1.3572 2.7113 sec/batch
Epoch 9/20  Iteration 1555/3560 Training loss: 1.3567 2.6190 sec/batch
Epoch 9/20  Iteration 1556/3560 Training loss: 1.3564 2.4848 sec/batch
Epoch 9/20  Iteration 1557/3560 Training loss: 1.3564 2.5096 sec/batch
Epoch 9/20  Iteration 1558/3560 Training loss: 1.3563 2.4964 sec/batch
Epoch 9/20  Iteration 1559/3560 Training loss: 1.3562 2.4952 sec/batch
Epoch 9/20  Iteration 1560/3560 Training loss: 1.3562 2.5337 sec/batch
Epoch 9/20  Iteration 1561/3560 Training loss: 1.3562 2.4689 sec/batch
Epoch 9/20  Iteration 1562/3560 Training loss: 1.3562 2.5233 sec/batch
Epoch 9/20  Iteration 1563/3560 Training loss: 1.3562 2.5117 sec/batch
Epoch 9/20  Iteration 1564/3560 Training loss: 1.3561 2.4712 sec/batch
Epoch 9/20  Iteration 1565/3560 Training loss: 1.3564 2.4913 sec/batch
Epoch 9/20  Iteration 1566/3560 Training loss: 1.3564 2.6693 sec/batch
Epoch 9/20  Iteration 1567/3560 Training loss: 1.3563 2.4844 sec/batch
Epoch 9/20  Iteration 1568/3560 Training loss: 1.3566 2.4933 sec/batch
Epoch 9/20  Iteration 1569/3560 Training loss: 1.3565 2.7107 sec/batch
Epoch 9/20  Iteration 1570/3560 Training loss: 1.3566 2.6181 sec/batch
Epoch 9/20  Iteration 1571/3560 Training loss: 1.3566 2.4957 sec/batch
Epoch 9/20  Iteration 1572/3560 Training loss: 1.3568 2.4887 sec/batch
Epoch 9/20  Iteration 1573/3560 Training loss: 1.3569 2.4924 sec/batch
Epoch 9/20  Iteration 1574/3560 Training loss: 1.3567 2.5047 sec/batch
Epoch 9/20  Iteration 1575/3560 Training loss: 1.3564 2.5379 sec/batch
Epoch 9/20  Iteration 1576/3560 Training loss: 1.3563 2.4716 sec/batch
Epoch 9/20  Iteration 1577/3560 Training loss: 1.3564 2.5321 sec/batch
Epoch 9/20  Iteration 1578/3560 Training loss: 1.3563 2.4987 sec/batch
Epoch 9/20  Iteration 1579/3560 Training loss: 1.3563 2.4751 sec/batch
Epoch 9/20  Iteration 1580/3560 Training loss: 1.3562 2.4772 sec/batch
Epoch 9/20  Iteration 1581/3560 Training loss: 1.3563 2.6564 sec/batch
Epoch 9/20  Iteration 1582/3560 Training loss: 1.3562 2.4889 sec/batch
Epoch 9/20  Iteration 1583/3560 Training loss: 1.3559 2.4946 sec/batch
Epoch 9/20  Iteration 1584/3560 Training loss: 1.3559 2.6993 sec/batch
Epoch 9/20  Iteration 1585/3560 Training loss: 1.3560 2.6310 sec/batch
Epoch 9/20  Iteration 1586/3560 Training loss: 1.3559 2.5058 sec/batch
Epoch 9/20  Iteration 1587/3560 Training loss: 1.3558 2.6756 sec/batch
Epoch 9/20  Iteration 1588/3560 Training loss: 1.3557 2.5032 sec/batch
Epoch 9/20  Iteration 1589/3560 Training loss: 1.3557 2.5268 sec/batch
Epoch 9/20  Iteration 1590/3560 Training loss: 1.3556 2.4774 sec/batch
Epoch 9/20  Iteration 1591/3560 Training loss: 1.3556 2.4817 sec/batch
Epoch 9/20  Iteration 1592/3560 Training loss: 1.3560 2.6647 sec/batch
Epoch 9/20  Iteration 1593/3560 Training loss: 1.3559 2.5909 sec/batch
Epoch 9/20  Iteration 1594/3560 Training loss: 1.3559 2.4929 sec/batch
Epoch 9/20  Iteration 1595/3560 Training loss: 1.3557 2.5964 sec/batch
Epoch 9/20  Iteration 1596/3560 Training loss: 1.3555 2.5487 sec/batch
Epoch 9/20  Iteration 1597/3560 Training loss: 1.3556 2.4704 sec/batch
Epoch 9/20  Iteration 1598/3560 Training loss: 1.3555 2.5087 sec/batch
Epoch 9/20  Iteration 1599/3560 Training loss: 1.3555 2.7606 sec/batch
Epoch 9/20  Iteration 1600/3560 Training loss: 1.3553 2.9567 sec/batch
Epoch 9/20  Iteration 1601/3560 Training loss: 1.3551 2.5522 sec/batch
Epoch 9/20  Iteration 1602/3560 Training loss: 1.3552 2.4955 sec/batch
Epoch 10/20  Iteration 1603/3560 Training loss: 1.4840 2.4806 sec/batch
Epoch 10/20  Iteration 1604/3560 Training loss: 1.4195 2.5350 sec/batch
Epoch 10/20  Iteration 1605/3560 Training loss: 1.3961 2.5102 sec/batch
Epoch 10/20  Iteration 1606/3560 Training loss: 1.3885 2.5080 sec/batch
Epoch 10/20  Iteration 1607/3560 Training loss: 1.3749 2.5177 sec/batch
Epoch 10/20  Iteration 1608/3560 Training loss: 1.3627 2.4787 sec/batch
Epoch 10/20  Iteration 1609/3560 Training loss: 1.3613 2.4824 sec/batch
Epoch 10/20  Iteration 1610/3560 Training loss: 1.3582 2.6131 sec/batch
Epoch 10/20  Iteration 1611/3560 Training loss: 1.3562 2.5471 sec/batch
Epoch 10/20  Iteration 1612/3560 Training loss: 1.3538 2.4824 sec/batch
Epoch 10/20  Iteration 1613/3560 Training loss: 1.3493 2.4965 sec/batch
Epoch 10/20  Iteration 1614/3560 Training loss: 1.3474 2.7735 sec/batch
Epoch 10/20  Iteration 1615/3560 Training loss: 1.3468 2.5464 sec/batch
Epoch 10/20  Iteration 1616/3560 Training loss: 1.3474 2.5046 sec/batch
Epoch 10/20  Iteration 1617/3560 Training loss: 1.3459 2.4697 sec/batch
Epoch 10/20  Iteration 1618/3560 Training loss: 1.3440 2.4981 sec/batch
Epoch 10/20  Iteration 1619/3560 Training loss: 1.3441 2.4951 sec/batch
Epoch 10/20  Iteration 1620/3560 Training loss: 1.3451 2.5158 sec/batch
Epoch 10/20  Iteration 1621/3560 Training loss: 1.3452 2.5021 sec/batch
Epoch 10/20  Iteration 1622/3560 Training loss: 1.3461 2.5242 sec/batch
Epoch 10/20  Iteration 1623/3560 Training loss: 1.3454 2.4711 sec/batch
Epoch 10/20  Iteration 1624/3560 Training loss: 1.3451 2.4821 sec/batch
Epoch 10/20  Iteration 1625/3560 Training loss: 1.3439 2.5933 sec/batch
Epoch 10/20  Iteration 1626/3560 Training loss: 1.3440 2.5261 sec/batch
Epoch 10/20  Iteration 1627/3560 Training loss: 1.3436 2.4939 sec/batch
Epoch 10/20  Iteration 1628/3560 Training loss: 1.3414 2.4858 sec/batch
Epoch 10/20  Iteration 1629/3560 Training loss: 1.3400 2.7804 sec/batch
Epoch 10/20  Iteration 1630/3560 Training loss: 1.3405 2.5414 sec/batch
Epoch 10/20  Iteration 1631/3560 Training loss: 1.3406 2.5286 sec/batch
Epoch 10/20  Iteration 1632/3560 Training loss: 1.3407 2.4802 sec/batch
Epoch 10/20  Iteration 1633/3560 Training loss: 1.3400 2.4916 sec/batch
Epoch 10/20  Iteration 1634/3560 Training loss: 1.3387 2.5279 sec/batch
Epoch 10/20  Iteration 1635/3560 Training loss: 1.3388 2.5241 sec/batch
Epoch 10/20  Iteration 1636/3560 Training loss: 1.3387 2.4985 sec/batch
Epoch 10/20  Iteration 1637/3560 Training loss: 1.3383 2.4991 sec/batch
Epoch 10/20  Iteration 1638/3560 Training loss: 1.3378 2.4818 sec/batch
Epoch 10/20  Iteration 1639/3560 Training loss: 1.3370 2.6716 sec/batch
Epoch 10/20  Iteration 1640/3560 Training loss: 1.3356 2.6426 sec/batch
Epoch 10/20  Iteration 1641/3560 Training loss: 1.3341 2.5472 sec/batch
Epoch 10/20  Iteration 1642/3560 Training loss: 1.3336 2.4806 sec/batch
Epoch 10/20  Iteration 1643/3560 Training loss: 1.3330 2.5471 sec/batch
Epoch 10/20  Iteration 1644/3560 Training loss: 1.3338 2.7266 sec/batch
Epoch 10/20  Iteration 1645/3560 Training loss: 1.3334 2.5905 sec/batch
Epoch 10/20  Iteration 1646/3560 Training loss: 1.3329 2.4932 sec/batch
Epoch 10/20  Iteration 1647/3560 Training loss: 1.3333 2.4821 sec/batch
Epoch 10/20  Iteration 1648/3560 Training loss: 1.3325 2.4847 sec/batch
Epoch 10/20  Iteration 1649/3560 Training loss: 1.3323 2.5539 sec/batch
Epoch 10/20  Iteration 1650/3560 Training loss: 1.3318 2.4789 sec/batch
Epoch 10/20  Iteration 1651/3560 Training loss: 1.3318 2.5030 sec/batch
Epoch 10/20  Iteration 1652/3560 Training loss: 1.3322 2.5030 sec/batch
Epoch 10/20  Iteration 1653/3560 Training loss: 1.3315 2.4872 sec/batch
Epoch 10/20  Iteration 1654/3560 Training loss: 1.3321 2.4870 sec/batch
Epoch 10/20  Iteration 1655/3560 Training loss: 1.3320 2.5885 sec/batch
Epoch 10/20  Iteration 1656/3560 Training loss: 1.3321 2.5350 sec/batch
Epoch 10/20  Iteration 1657/3560 Training loss: 1.3319 2.4991 sec/batch
Epoch 10/20  Iteration 1658/3560 Training loss: 1.3320 2.5205 sec/batch
Epoch 10/20  Iteration 1659/3560 Training loss: 1.3323 2.7234 sec/batch
Epoch 10/20  Iteration 1660/3560 Training loss: 1.3320 2.5643 sec/batch
Epoch 10/20  Iteration 1661/3560 Training loss: 1.3314 2.5066 sec/batch
Epoch 10/20  Iteration 1662/3560 Training loss: 1.3318 2.4740 sec/batch
Epoch 10/20  Iteration 1663/3560 Training loss: 1.3316 2.4851 sec/batch
Epoch 10/20  Iteration 1664/3560 Training loss: 1.3322 2.5328 sec/batch
Epoch 10/20  Iteration 1665/3560 Training loss: 1.3323 2.4924 sec/batch
Epoch 10/20  Iteration 1666/3560 Training loss: 1.3325 2.5082 sec/batch
Epoch 10/20  Iteration 1667/3560 Training loss: 1.3322 2.5055 sec/batch
Epoch 10/20  Iteration 1668/3560 Training loss: 1.3325 2.4832 sec/batch
Epoch 10/20  Iteration 1669/3560 Training loss: 1.3326 2.4954 sec/batch
Epoch 10/20  Iteration 1670/3560 Training loss: 1.3324 2.5975 sec/batch
Epoch 10/20  Iteration 1671/3560 Training loss: 1.3325 2.5348 sec/batch
Epoch 10/20  Iteration 1672/3560 Training loss: 1.3323 2.4927 sec/batch
Epoch 10/20  Iteration 1673/3560 Training loss: 1.3330 2.5207 sec/batch
Epoch 10/20  Iteration 1674/3560 Training loss: 1.3332 2.7252 sec/batch
Epoch 10/20  Iteration 1675/3560 Training loss: 1.3335 2.5799 sec/batch
Epoch 10/20  Iteration 1676/3560 Training loss: 1.3331 2.4830 sec/batch
Epoch 10/20  Iteration 1677/3560 Training loss: 1.3330 2.4689 sec/batch
Epoch 10/20  Iteration 1678/3560 Training loss: 1.3331 2.4981 sec/batch
Epoch 10/20  Iteration 1679/3560 Training loss: 1.3330 2.5416 sec/batch
Epoch 10/20  Iteration 1680/3560 Training loss: 1.3329 2.4783 sec/batch
Epoch 10/20  Iteration 1681/3560 Training loss: 1.3321 2.5278 sec/batch
Epoch 10/20  Iteration 1682/3560 Training loss: 1.3319 2.4987 sec/batch
Epoch 10/20  Iteration 1683/3560 Training loss: 1.3315 2.4846 sec/batch
Epoch 10/20  Iteration 1684/3560 Training loss: 1.3313 2.4844 sec/batch
Epoch 10/20  Iteration 1685/3560 Training loss: 1.3308 2.5789 sec/batch
Epoch 10/20  Iteration 1686/3560 Training loss: 1.3308 2.5554 sec/batch
Epoch 10/20  Iteration 1687/3560 Training loss: 1.3305 2.7590 sec/batch
Epoch 10/20  Iteration 1688/3560 Training loss: 1.3304 2.5781 sec/batch
Epoch 10/20  Iteration 1689/3560 Training loss: 1.3301 2.7002 sec/batch
Epoch 10/20  Iteration 1690/3560 Training loss: 1.3297 2.5549 sec/batch
Epoch 10/20  Iteration 1691/3560 Training loss: 1.3292 2.4965 sec/batch
Epoch 10/20  Iteration 1692/3560 Training loss: 1.3292 2.4959 sec/batch
Epoch 10/20  Iteration 1693/3560 Training loss: 1.3289 2.4878 sec/batch
Epoch 10/20  Iteration 1694/3560 Training loss: 1.3287 2.5469 sec/batch
Epoch 10/20  Iteration 1695/3560 Training loss: 1.3282 2.4725 sec/batch
Epoch 10/20  Iteration 1696/3560 Training loss: 1.3278 2.5169 sec/batch
Epoch 10/20  Iteration 1697/3560 Training loss: 1.3275 2.5010 sec/batch
Epoch 10/20  Iteration 1698/3560 Training loss: 1.3275 2.4818 sec/batch
Epoch 10/20  Iteration 1699/3560 Training loss: 1.3274 2.4947 sec/batch
...
Epoch 10/20  Iteration 1780/3560 Training loss: 1.3191 2.5251 sec/batch
Epoch 11/20  Iteration 1781/3560 Training loss: 1.4394 2.6225 sec/batch
...
Epoch 11/20  Iteration 1958/3560 Training loss: 1.2886 2.7522 sec/batch
Epoch 12/20  Iteration 1959/3560 Training loss: 1.4078 2.5030 sec/batch
...
Epoch 12/20  Iteration 2136/3560 Training loss: 1.2640 2.5283 sec/batch
Epoch 13/20  Iteration 2137/3560 Training loss: 1.3766 2.7322 sec/batch
...
Epoch 13/20  Iteration 2314/3560 Training loss: 1.2433 2.4680 sec/batch
Epoch 14/20  Iteration 2315/3560 Training loss: 1.3632 2.4635 sec/batch
...
Epoch 14/20  Iteration 2492/3560 Training loss: 1.2236 2.4603 sec/batch
Epoch 15/20  Iteration 2493/3560 Training loss: 1.3350 2.4558 sec/batch
...
Epoch 15/20  Iteration 2589/3560 Training loss: 1.2125 2.4587 sec/batch
Epoch 15/20  Iteration 2590/3560 Training loss: 1.2121 2.4532 sec/batch
Epoch 15/20  Iteration 2591/3560 Training loss: 1.2118 2.6387 sec/batch
Epoch 15/20  Iteration 2592/3560 Training loss: 1.2115 2.5142 sec/batch
Epoch 15/20  Iteration 2593/3560 Training loss: 1.2114 2.4668 sec/batch
Epoch 15/20  Iteration 2594/3560 Training loss: 1.2112 2.4556 sec/batch
Epoch 15/20  Iteration 2595/3560 Training loss: 1.2111 2.4575 sec/batch
Epoch 15/20  Iteration 2596/3560 Training loss: 1.2110 2.4663 sec/batch
Epoch 15/20  Iteration 2597/3560 Training loss: 1.2108 2.4669 sec/batch
Epoch 15/20  Iteration 2598/3560 Training loss: 1.2107 2.4599 sec/batch
Epoch 15/20  Iteration 2599/3560 Training loss: 1.2107 2.4482 sec/batch
Epoch 15/20  Iteration 2600/3560 Training loss: 1.2108 2.4687 sec/batch
Epoch 15/20  Iteration 2601/3560 Training loss: 1.2105 2.4559 sec/batch
Epoch 15/20  Iteration 2602/3560 Training loss: 1.2106 2.4629 sec/batch
Epoch 15/20  Iteration 2603/3560 Training loss: 1.2104 2.4569 sec/batch
Epoch 15/20  Iteration 2604/3560 Training loss: 1.2104 2.4485 sec/batch
Epoch 15/20  Iteration 2605/3560 Training loss: 1.2103 2.4574 sec/batch
Epoch 15/20  Iteration 2606/3560 Training loss: 1.2102 2.4520 sec/batch
Epoch 15/20  Iteration 2607/3560 Training loss: 1.2100 2.4461 sec/batch
Epoch 15/20  Iteration 2608/3560 Training loss: 1.2097 2.4650 sec/batch
Epoch 15/20  Iteration 2609/3560 Training loss: 1.2096 2.4597 sec/batch
Epoch 15/20  Iteration 2610/3560 Training loss: 1.2095 2.4574 sec/batch
Epoch 15/20  Iteration 2611/3560 Training loss: 1.2094 2.4464 sec/batch
Epoch 15/20  Iteration 2612/3560 Training loss: 1.2094 2.4560 sec/batch
Epoch 15/20  Iteration 2613/3560 Training loss: 1.2092 2.4721 sec/batch
Epoch 15/20  Iteration 2614/3560 Training loss: 1.2089 2.4482 sec/batch
Epoch 15/20  Iteration 2615/3560 Training loss: 1.2085 2.4612 sec/batch
Epoch 15/20  Iteration 2616/3560 Training loss: 1.2085 2.4475 sec/batch
Epoch 15/20  Iteration 2617/3560 Training loss: 1.2084 2.4610 sec/batch
Epoch 15/20  Iteration 2618/3560 Training loss: 1.2081 2.4597 sec/batch
Epoch 15/20  Iteration 2619/3560 Training loss: 1.2080 2.4499 sec/batch
Epoch 15/20  Iteration 2620/3560 Training loss: 1.2080 2.4668 sec/batch
Epoch 15/20  Iteration 2621/3560 Training loss: 1.2079 2.4648 sec/batch
Epoch 15/20  Iteration 2622/3560 Training loss: 1.2076 2.4651 sec/batch
Epoch 15/20  Iteration 2623/3560 Training loss: 1.2072 2.4525 sec/batch
Epoch 15/20  Iteration 2624/3560 Training loss: 1.2071 2.4483 sec/batch
Epoch 15/20  Iteration 2625/3560 Training loss: 1.2071 2.4521 sec/batch
Epoch 15/20  Iteration 2626/3560 Training loss: 1.2070 2.4640 sec/batch
Epoch 15/20  Iteration 2627/3560 Training loss: 1.2070 2.4616 sec/batch
Epoch 15/20  Iteration 2628/3560 Training loss: 1.2070 2.4507 sec/batch
Epoch 15/20  Iteration 2629/3560 Training loss: 1.2071 2.4633 sec/batch
Epoch 15/20  Iteration 2630/3560 Training loss: 1.2072 2.4531 sec/batch
Epoch 15/20  Iteration 2631/3560 Training loss: 1.2072 2.4574 sec/batch
Epoch 15/20  Iteration 2632/3560 Training loss: 1.2072 2.4640 sec/batch
Epoch 15/20  Iteration 2633/3560 Training loss: 1.2076 2.4633 sec/batch
Epoch 15/20  Iteration 2634/3560 Training loss: 1.2077 2.4678 sec/batch
Epoch 15/20  Iteration 2635/3560 Training loss: 1.2076 2.4632 sec/batch
Epoch 15/20  Iteration 2636/3560 Training loss: 1.2079 2.4562 sec/batch
Epoch 15/20  Iteration 2637/3560 Training loss: 1.2078 2.4702 sec/batch
Epoch 15/20  Iteration 2638/3560 Training loss: 1.2079 2.4439 sec/batch
Epoch 15/20  Iteration 2639/3560 Training loss: 1.2080 2.5136 sec/batch
Epoch 15/20  Iteration 2640/3560 Training loss: 1.2082 2.6094 sec/batch
Epoch 15/20  Iteration 2641/3560 Training loss: 1.2083 2.5102 sec/batch
Epoch 15/20  Iteration 2642/3560 Training loss: 1.2082 2.4486 sec/batch
Epoch 15/20  Iteration 2643/3560 Training loss: 1.2079 2.4625 sec/batch
Epoch 15/20  Iteration 2644/3560 Training loss: 1.2076 2.4481 sec/batch
Epoch 15/20  Iteration 2645/3560 Training loss: 1.2077 2.4613 sec/batch
Epoch 15/20  Iteration 2646/3560 Training loss: 1.2076 2.4775 sec/batch
Epoch 15/20  Iteration 2647/3560 Training loss: 1.2076 2.4458 sec/batch
Epoch 15/20  Iteration 2648/3560 Training loss: 1.2075 2.4709 sec/batch
Epoch 15/20  Iteration 2649/3560 Training loss: 1.2075 2.4522 sec/batch
Epoch 15/20  Iteration 2650/3560 Training loss: 1.2075 2.4531 sec/batch
Epoch 15/20  Iteration 2651/3560 Training loss: 1.2072 2.4641 sec/batch
Epoch 15/20  Iteration 2652/3560 Training loss: 1.2073 2.4555 sec/batch
Epoch 15/20  Iteration 2653/3560 Training loss: 1.2074 2.4554 sec/batch
Epoch 15/20  Iteration 2654/3560 Training loss: 1.2074 2.4490 sec/batch
Epoch 15/20  Iteration 2655/3560 Training loss: 1.2073 2.4653 sec/batch
Epoch 15/20  Iteration 2656/3560 Training loss: 1.2073 2.5029 sec/batch
Epoch 15/20  Iteration 2657/3560 Training loss: 1.2073 2.7281 sec/batch
Epoch 15/20  Iteration 2658/3560 Training loss: 1.2072 2.6695 sec/batch
Epoch 15/20  Iteration 2659/3560 Training loss: 1.2074 2.4635 sec/batch
Epoch 15/20  Iteration 2660/3560 Training loss: 1.2077 2.4592 sec/batch
Epoch 15/20  Iteration 2661/3560 Training loss: 1.2077 2.4522 sec/batch
Epoch 15/20  Iteration 2662/3560 Training loss: 1.2077 2.4523 sec/batch
Epoch 15/20  Iteration 2663/3560 Training loss: 1.2077 2.4581 sec/batch
Epoch 15/20  Iteration 2664/3560 Training loss: 1.2075 2.4541 sec/batch
Epoch 15/20  Iteration 2665/3560 Training loss: 1.2077 2.4626 sec/batch
Epoch 15/20  Iteration 2666/3560 Training loss: 1.2077 2.4559 sec/batch
Epoch 15/20  Iteration 2667/3560 Training loss: 1.2077 2.4500 sec/batch
Epoch 15/20  Iteration 2668/3560 Training loss: 1.2075 2.4598 sec/batch
Epoch 15/20  Iteration 2669/3560 Training loss: 1.2074 2.4492 sec/batch
Epoch 15/20  Iteration 2670/3560 Training loss: 1.2076 2.4778 sec/batch
Epoch 16/20  Iteration 2671/3560 Training loss: 1.3182 2.4435 sec/batch
Epoch 16/20  Iteration 2672/3560 Training loss: 1.2722 2.8576 sec/batch
Epoch 16/20  Iteration 2673/3560 Training loss: 1.2554 2.4593 sec/batch
Epoch 16/20  Iteration 2674/3560 Training loss: 1.2512 2.4496 sec/batch
...
Epoch 16/20  Iteration 2847/3560 Training loss: 1.1934 2.4637 sec/batch
Epoch 16/20  Iteration 2848/3560 Training loss: 1.1937 2.4594 sec/batch
Epoch 17/20  Iteration 2849/3560 Training loss: 1.3020 2.4608 sec/batch
Epoch 17/20  Iteration 2850/3560 Training loss: 1.2552 2.4601 sec/batch
Epoch 17/20  Iteration 2851/3560 Training loss: 1.2390 2.4502 sec/batch
...
Epoch 17/20  Iteration 3025/3560 Training loss: 1.1787 2.4555 sec/batch
Epoch 17/20  Iteration 3026/3560 Training loss: 1.1788 2.4472 sec/batch
Epoch 18/20  Iteration 3027/3560 Training loss: 1.2813 2.6206 sec/batch
Epoch 18/20  Iteration 3028/3560 Training loss: 1.2343 2.5096 sec/batch
Epoch 18/20  Iteration 3029/3560 Training loss: 1.2183 2.4500 sec/batch
...
Epoch 18/20  Iteration 3203/3560 Training loss: 1.1660 2.4711 sec/batch
Epoch 18/20  Iteration 3204/3560 Training loss: 1.1662 2.4508 sec/batch
Epoch 19/20  Iteration 3205/3560 Training loss: 1.2681 2.4572 sec/batch
Epoch 19/20  Iteration 3206/3560 Training loss: 1.2251 2.4562 sec/batch
Epoch 19/20  Iteration 3207/3560 Training loss: 1.2076 2.4491 sec/batch
...
Epoch 19/20  Iteration 3377/3560 Training loss: 1.1550 2.4494 sec/batch
Epoch 19/20  Iteration 3378/3560 Training loss: 1.1551 2.4579 sec/batch
Epoch 19/20  Iteration 3379/3560 Training loss: 1.1551 2.4453 sec/batch
Epoch 19/20  Iteration 3380/3560 Training loss: 1.1549 2.4729 sec/batch
Epoch 19/20  Iteration 3381/3560 Training loss: 1.1549 2.4476 sec/batch
Epoch 19/20  Iteration 3382/3560 Training loss: 1.1551 2.4709 sec/batch
Epoch 20/20  Iteration 3383/3560 Training loss: 1.2512 2.4463 sec/batch
...
Epoch 20/20  Iteration 3560/3560 Training loss: 1.1443 2.4674 sec/batch
Epoch 1/20  Iteration 1/3560 Training loss: 4.4219 2.4841 sec/batch
...
Epoch 1/20  Iteration 178/3560 Training loss: 2.8331 2.4565 sec/batch
Epoch 2/20  Iteration 179/3560 Training loss: 2.4509 2.4479 sec/batch
...
Epoch 2/20  Iteration 356/3560 Training loss: 2.2080 2.4427 sec/batch
Epoch 3/20  Iteration 357/3560 Training loss: 2.2047 2.4548 sec/batch
...
Epoch 3/20  Iteration 534/3560 Training loss: 2.0223 2.6453 sec/batch
Epoch 4/20  Iteration 535/3560 Training loss: 2.0464 2.5128 sec/batch
...
Epoch 4/20  Iteration 629/3560 Training loss: 1.9177 2.4474 sec/batch
Epoch 4/20  Iteration 630/3560 Training loss: 1.9173 2.4599 sec/batch
Epoch 4/20  Iteration 631/3560 Training loss: 1.9169 2.5845 sec/batch
Epoch 4/20  Iteration 632/3560 Training loss: 1.9163 2.5726 sec/batch
Epoch 4/20  Iteration 633/3560 Training loss: 1.9156 2.4461 sec/batch
Epoch 4/20  Iteration 634/3560 Training loss: 1.9147 2.4595 sec/batch
Epoch 4/20  Iteration 635/3560 Training loss: 1.9144 2.4537 sec/batch
Epoch 4/20  Iteration 636/3560 Training loss: 1.9140 2.4488 sec/batch
Epoch 4/20  Iteration 637/3560 Training loss: 1.9135 2.4712 sec/batch
Epoch 4/20  Iteration 638/3560 Training loss: 1.9130 2.4531 sec/batch
Epoch 4/20  Iteration 639/3560 Training loss: 1.9125 2.4533 sec/batch
Epoch 4/20  Iteration 640/3560 Training loss: 1.9120 2.4487 sec/batch
Epoch 4/20  Iteration 641/3560 Training loss: 1.9117 2.4516 sec/batch
Epoch 4/20  Iteration 642/3560 Training loss: 1.9114 2.4584 sec/batch
Epoch 4/20  Iteration 643/3560 Training loss: 1.9112 2.4565 sec/batch
Epoch 4/20  Iteration 644/3560 Training loss: 1.9108 2.4573 sec/batch
Epoch 4/20  Iteration 645/3560 Training loss: 1.9104 2.4481 sec/batch
Epoch 4/20  Iteration 646/3560 Training loss: 1.9100 2.4618 sec/batch
Epoch 4/20  Iteration 647/3560 Training loss: 1.9096 2.4627 sec/batch
Epoch 4/20  Iteration 648/3560 Training loss: 1.9092 2.4452 sec/batch
Epoch 4/20  Iteration 649/3560 Training loss: 1.9086 2.4565 sec/batch
Epoch 4/20  Iteration 650/3560 Training loss: 1.9079 2.4635 sec/batch
Epoch 4/20  Iteration 651/3560 Training loss: 1.9076 2.4658 sec/batch
Epoch 4/20  Iteration 652/3560 Training loss: 1.9072 2.4523 sec/batch
Epoch 4/20  Iteration 653/3560 Training loss: 1.9069 2.4762 sec/batch
Epoch 4/20  Iteration 654/3560 Training loss: 1.9065 2.4554 sec/batch
Epoch 4/20  Iteration 655/3560 Training loss: 1.9062 2.4586 sec/batch
Epoch 4/20  Iteration 656/3560 Training loss: 1.9056 2.4663 sec/batch
Epoch 4/20  Iteration 657/3560 Training loss: 1.9050 2.4447 sec/batch
Epoch 4/20  Iteration 658/3560 Training loss: 1.9048 2.4560 sec/batch
Epoch 4/20  Iteration 659/3560 Training loss: 1.9044 2.4429 sec/batch
Epoch 4/20  Iteration 660/3560 Training loss: 1.9038 2.4531 sec/batch
Epoch 4/20  Iteration 661/3560 Training loss: 1.9036 2.4811 sec/batch
Epoch 4/20  Iteration 662/3560 Training loss: 1.9033 2.4840 sec/batch
Epoch 4/20  Iteration 663/3560 Training loss: 1.9029 2.4544 sec/batch
Epoch 4/20  Iteration 664/3560 Training loss: 1.9026 2.4504 sec/batch
Epoch 4/20  Iteration 665/3560 Training loss: 1.9020 2.4630 sec/batch
Epoch 4/20  Iteration 666/3560 Training loss: 1.9014 2.4613 sec/batch
Epoch 4/20  Iteration 667/3560 Training loss: 1.9011 2.4466 sec/batch
Epoch 4/20  Iteration 668/3560 Training loss: 1.9008 2.4608 sec/batch
Epoch 4/20  Iteration 669/3560 Training loss: 1.9005 2.4477 sec/batch
Epoch 4/20  Iteration 670/3560 Training loss: 1.9003 2.4676 sec/batch
Epoch 4/20  Iteration 671/3560 Training loss: 1.9001 2.4479 sec/batch
Epoch 4/20  Iteration 672/3560 Training loss: 1.8999 2.4441 sec/batch
Epoch 4/20  Iteration 673/3560 Training loss: 1.8998 2.4588 sec/batch
Epoch 4/20  Iteration 674/3560 Training loss: 1.8994 2.4695 sec/batch
Epoch 4/20  Iteration 675/3560 Training loss: 1.8994 2.4629 sec/batch
Epoch 4/20  Iteration 676/3560 Training loss: 1.8991 2.4677 sec/batch
Epoch 4/20  Iteration 677/3560 Training loss: 1.8989 2.4504 sec/batch
Epoch 4/20  Iteration 678/3560 Training loss: 1.8987 2.4455 sec/batch
Epoch 4/20  Iteration 679/3560 Training loss: 1.8983 2.4549 sec/batch
Epoch 4/20  Iteration 680/3560 Training loss: 1.8982 2.6525 sec/batch
Epoch 4/20  Iteration 681/3560 Training loss: 1.8981 2.4873 sec/batch
Epoch 4/20  Iteration 682/3560 Training loss: 1.8980 2.4584 sec/batch
Epoch 4/20  Iteration 683/3560 Training loss: 1.8979 2.4511 sec/batch
Epoch 4/20  Iteration 684/3560 Training loss: 1.8976 2.4513 sec/batch
Epoch 4/20  Iteration 685/3560 Training loss: 1.8972 2.4611 sec/batch
Epoch 4/20  Iteration 686/3560 Training loss: 1.8971 2.4555 sec/batch
Epoch 4/20  Iteration 687/3560 Training loss: 1.8969 2.4606 sec/batch
Epoch 4/20  Iteration 688/3560 Training loss: 1.8967 2.7521 sec/batch
Epoch 4/20  Iteration 689/3560 Training loss: 1.8964 2.4679 sec/batch
Epoch 4/20  Iteration 690/3560 Training loss: 1.8962 2.4550 sec/batch
Epoch 4/20  Iteration 691/3560 Training loss: 1.8960 2.4680 sec/batch
Epoch 4/20  Iteration 692/3560 Training loss: 1.8958 2.4511 sec/batch
Epoch 4/20  Iteration 693/3560 Training loss: 1.8954 2.4522 sec/batch
Epoch 4/20  Iteration 694/3560 Training loss: 1.8953 2.4740 sec/batch
Epoch 4/20  Iteration 695/3560 Training loss: 1.8952 2.4601 sec/batch
Epoch 4/20  Iteration 696/3560 Training loss: 1.8949 2.4567 sec/batch
Epoch 4/20  Iteration 697/3560 Training loss: 1.8947 2.4517 sec/batch
Epoch 4/20  Iteration 698/3560 Training loss: 1.8945 2.4718 sec/batch
Epoch 4/20  Iteration 699/3560 Training loss: 1.8942 2.4584 sec/batch
Epoch 4/20  Iteration 700/3560 Training loss: 1.8939 2.4731 sec/batch
Epoch 4/20  Iteration 701/3560 Training loss: 1.8938 2.4469 sec/batch
Epoch 4/20  Iteration 702/3560 Training loss: 1.8939 2.4499 sec/batch
Epoch 4/20  Iteration 703/3560 Training loss: 1.8936 2.4643 sec/batch
Epoch 4/20  Iteration 704/3560 Training loss: 1.8933 2.4472 sec/batch
Epoch 4/20  Iteration 705/3560 Training loss: 1.8930 2.4513 sec/batch
Epoch 4/20  Iteration 706/3560 Training loss: 1.8927 2.4494 sec/batch
Epoch 4/20  Iteration 707/3560 Training loss: 1.8925 2.4465 sec/batch
Epoch 4/20  Iteration 708/3560 Training loss: 1.8923 2.4577 sec/batch
Epoch 4/20  Iteration 709/3560 Training loss: 1.8921 2.4566 sec/batch
Epoch 4/20  Iteration 710/3560 Training loss: 1.8918 2.4812 sec/batch
Epoch 4/20  Iteration 711/3560 Training loss: 1.8914 2.4580 sec/batch
Epoch 4/20  Iteration 712/3560 Training loss: 1.8912 2.4616 sec/batch
Epoch 5/20  Iteration 713/3560 Training loss: 1.9173 2.4496 sec/batch
Epoch 5/20  Iteration 714/3560 Training loss: 1.8729 2.4500 sec/batch
Epoch 5/20  Iteration 715/3560 Training loss: 1.8624 2.4606 sec/batch
Epoch 5/20  Iteration 716/3560 Training loss: 1.8537 2.4605 sec/batch
Epoch 5/20  Iteration 717/3560 Training loss: 1.8473 2.4589 sec/batch
Epoch 5/20  Iteration 718/3560 Training loss: 1.8383 2.4536 sec/batch
Epoch 5/20  Iteration 719/3560 Training loss: 1.8381 2.4654 sec/batch
Epoch 5/20  Iteration 720/3560 Training loss: 1.8369 2.4468 sec/batch
Epoch 5/20  Iteration 721/3560 Training loss: 1.8388 2.4439 sec/batch
Epoch 5/20  Iteration 722/3560 Training loss: 1.8383 2.4698 sec/batch
Epoch 5/20  Iteration 723/3560 Training loss: 1.8349 2.4640 sec/batch
Epoch 5/20  Iteration 724/3560 Training loss: 1.8328 2.4666 sec/batch
Epoch 5/20  Iteration 725/3560 Training loss: 1.8322 2.4506 sec/batch
Epoch 5/20  Iteration 726/3560 Training loss: 1.8342 2.4572 sec/batch
Epoch 5/20  Iteration 727/3560 Training loss: 1.8326 2.4530 sec/batch
Epoch 5/20  Iteration 728/3560 Training loss: 1.8308 2.6309 sec/batch
Epoch 5/20  Iteration 729/3560 Training loss: 1.8305 2.5104 sec/batch
Epoch 5/20  Iteration 730/3560 Training loss: 1.8330 2.4519 sec/batch
Epoch 5/20  Iteration 731/3560 Training loss: 1.8329 2.4470 sec/batch
Epoch 5/20  Iteration 732/3560 Training loss: 1.8330 2.4617 sec/batch
Epoch 5/20  Iteration 733/3560 Training loss: 1.8324 2.4470 sec/batch
Epoch 5/20  Iteration 734/3560 Training loss: 1.8331 2.4660 sec/batch
Epoch 5/20  Iteration 735/3560 Training loss: 1.8321 2.4681 sec/batch
Epoch 5/20  Iteration 736/3560 Training loss: 1.8317 2.4579 sec/batch
Epoch 5/20  Iteration 737/3560 Training loss: 1.8314 2.4553 sec/batch
Epoch 5/20  Iteration 738/3560 Training loss: 1.8298 2.4610 sec/batch
Epoch 5/20  Iteration 739/3560 Training loss: 1.8283 2.4454 sec/batch
Epoch 5/20  Iteration 740/3560 Training loss: 1.8285 2.4612 sec/batch
Epoch 5/20  Iteration 741/3560 Training loss: 1.8290 2.4590 sec/batch
Epoch 5/20  Iteration 742/3560 Training loss: 1.8291 2.4488 sec/batch
Epoch 5/20  Iteration 743/3560 Training loss: 1.8289 2.4602 sec/batch
Epoch 5/20  Iteration 744/3560 Training loss: 1.8277 2.4522 sec/batch
Epoch 5/20  Iteration 745/3560 Training loss: 1.8276 2.4623 sec/batch
Epoch 5/20  Iteration 746/3560 Training loss: 1.8283 2.4475 sec/batch
Epoch 5/20  Iteration 747/3560 Training loss: 1.8278 2.4754 sec/batch
Epoch 5/20  Iteration 748/3560 Training loss: 1.8273 2.4736 sec/batch
Epoch 5/20  Iteration 749/3560 Training loss: 1.8264 2.4608 sec/batch
Epoch 5/20  Iteration 750/3560 Training loss: 1.8248 2.4630 sec/batch
Epoch 5/20  Iteration 751/3560 Training loss: 1.8232 2.4527 sec/batch
Epoch 5/20  Iteration 752/3560 Training loss: 1.8223 2.4606 sec/batch
Epoch 5/20  Iteration 753/3560 Training loss: 1.8218 2.4700 sec/batch
Epoch 5/20  Iteration 754/3560 Training loss: 1.8218 2.4449 sec/batch
Epoch 5/20  Iteration 755/3560 Training loss: 1.8211 2.4592 sec/batch
Epoch 5/20  Iteration 756/3560 Training loss: 1.8202 2.4495 sec/batch
Epoch 5/20  Iteration 757/3560 Training loss: 1.8204 2.4391 sec/batch
Epoch 5/20  Iteration 758/3560 Training loss: 1.8189 2.4553 sec/batch
Epoch 5/20  Iteration 759/3560 Training loss: 1.8183 2.4582 sec/batch
Epoch 5/20  Iteration 760/3560 Training loss: 1.8176 2.4764 sec/batch
Epoch 5/20  Iteration 761/3560 Training loss: 1.8173 2.4487 sec/batch
Epoch 5/20  Iteration 762/3560 Training loss: 1.8179 2.4576 sec/batch
Epoch 5/20  Iteration 763/3560 Training loss: 1.8170 2.4524 sec/batch
Epoch 5/20  Iteration 764/3560 Training loss: 1.8176 2.4415 sec/batch
Epoch 5/20  Iteration 765/3560 Training loss: 1.8173 2.4673 sec/batch
Epoch 5/20  Iteration 766/3560 Training loss: 1.8168 2.4485 sec/batch
Epoch 5/20  Iteration 767/3560 Training loss: 1.8164 2.4575 sec/batch
Epoch 5/20  Iteration 768/3560 Training loss: 1.8164 2.4515 sec/batch
Epoch 5/20  Iteration 769/3560 Training loss: 1.8166 2.4480 sec/batch
Epoch 5/20  Iteration 770/3560 Training loss: 1.8162 2.4629 sec/batch
Epoch 5/20  Iteration 771/3560 Training loss: 1.8154 2.4650 sec/batch
Epoch 5/20  Iteration 772/3560 Training loss: 1.8159 2.4709 sec/batch
Epoch 5/20  Iteration 773/3560 Training loss: 1.8156 2.4491 sec/batch
Epoch 5/20  Iteration 774/3560 Training loss: 1.8161 2.4671 sec/batch
Epoch 5/20  Iteration 775/3560 Training loss: 1.8163 2.4646 sec/batch
Epoch 5/20  Iteration 776/3560 Training loss: 1.8165 2.4539 sec/batch
Epoch 5/20  Iteration 777/3560 Training loss: 1.8160 2.4807 sec/batch
Epoch 5/20  Iteration 778/3560 Training loss: 1.8162 2.6741 sec/batch
Epoch 5/20  Iteration 779/3560 Training loss: 1.8162 2.4644 sec/batch
Epoch 5/20  Iteration 780/3560 Training loss: 1.8155 2.4563 sec/batch
Epoch 5/20  Iteration 781/3560 Training loss: 1.8151 2.4706 sec/batch
Epoch 5/20  Iteration 782/3560 Training loss: 1.8148 2.4564 sec/batch
Epoch 5/20  Iteration 783/3560 Training loss: 1.8150 2.4592 sec/batch
Epoch 5/20  Iteration 784/3560 Training loss: 1.8149 2.4709 sec/batch
Epoch 5/20  Iteration 785/3560 Training loss: 1.8150 2.4461 sec/batch
Epoch 5/20  Iteration 786/3560 Training loss: 1.8144 2.4565 sec/batch
Epoch 5/20  Iteration 787/3560 Training loss: 1.8141 2.4500 sec/batch
Epoch 5/20  Iteration 788/3560 Training loss: 1.8141 2.4459 sec/batch
Epoch 5/20  Iteration 789/3560 Training loss: 1.8138 2.4616 sec/batch
Epoch 5/20  Iteration 790/3560 Training loss: 1.8136 2.4515 sec/batch
Epoch 5/20  Iteration 791/3560 Training loss: 1.8129 2.4583 sec/batch
Epoch 5/20  Iteration 792/3560 Training loss: 1.8124 2.4499 sec/batch
Epoch 5/20  Iteration 793/3560 Training loss: 1.8118 2.4559 sec/batch
Epoch 5/20  Iteration 794/3560 Training loss: 1.8116 2.4557 sec/batch
Epoch 5/20  Iteration 795/3560 Training loss: 1.8108 2.4635 sec/batch
Epoch 5/20  Iteration 796/3560 Training loss: 1.8105 2.4533 sec/batch
Epoch 5/20  Iteration 797/3560 Training loss: 1.8099 2.4547 sec/batch
Epoch 5/20  Iteration 798/3560 Training loss: 1.8092 2.4612 sec/batch
Epoch 5/20  Iteration 799/3560 Training loss: 1.8088 2.4548 sec/batch
Epoch 5/20  Iteration 800/3560 Training loss: 1.8082 2.4485 sec/batch
Epoch 5/20  Iteration 801/3560 Training loss: 1.8075 2.4595 sec/batch
Epoch 5/20  Iteration 802/3560 Training loss: 1.8074 2.4526 sec/batch
Epoch 5/20  Iteration 803/3560 Training loss: 1.8069 2.4580 sec/batch
Epoch 5/20  Iteration 804/3560 Training loss: 1.8065 2.4518 sec/batch
Epoch 5/20  Iteration 805/3560 Training loss: 1.8059 2.4623 sec/batch
Epoch 5/20  Iteration 806/3560 Training loss: 1.8054 2.4627 sec/batch
Epoch 5/20  Iteration 807/3560 Training loss: 1.8048 2.4548 sec/batch
Epoch 5/20  Iteration 808/3560 Training loss: 1.8046 2.4832 sec/batch
Epoch 5/20  Iteration 809/3560 Training loss: 1.8042 2.4799 sec/batch
Epoch 5/20  Iteration 810/3560 Training loss: 1.8036 2.6920 sec/batch
Epoch 5/20  Iteration 811/3560 Training loss: 1.8030 2.4536 sec/batch
Epoch 5/20  Iteration 812/3560 Training loss: 1.8022 2.4517 sec/batch
Epoch 5/20  Iteration 813/3560 Training loss: 1.8019 2.4564 sec/batch
Epoch 5/20  Iteration 814/3560 Training loss: 1.8015 2.4540 sec/batch
Epoch 5/20  Iteration 815/3560 Training loss: 1.8010 2.4695 sec/batch
Epoch 5/20  Iteration 816/3560 Training loss: 1.8006 2.4604 sec/batch
Epoch 5/20  Iteration 817/3560 Training loss: 1.8001 2.4560 sec/batch
Epoch 5/20  Iteration 818/3560 Training loss: 1.7997 2.4548 sec/batch
Epoch 5/20  Iteration 819/3560 Training loss: 1.7994 2.4445 sec/batch
Epoch 5/20  Iteration 820/3560 Training loss: 1.7991 2.4878 sec/batch
Epoch 5/20  Iteration 821/3560 Training loss: 1.7989 2.4552 sec/batch
Epoch 5/20  Iteration 822/3560 Training loss: 1.7987 2.4509 sec/batch
Epoch 5/20  Iteration 823/3560 Training loss: 1.7984 2.4795 sec/batch
Epoch 5/20  Iteration 824/3560 Training loss: 1.7981 2.4499 sec/batch
Epoch 5/20  Iteration 825/3560 Training loss: 1.7977 2.4670 sec/batch
Epoch 5/20  Iteration 826/3560 Training loss: 1.7973 2.6470 sec/batch
Epoch 5/20  Iteration 827/3560 Training loss: 1.7969 2.5014 sec/batch
Epoch 5/20  Iteration 828/3560 Training loss: 1.7963 2.4492 sec/batch
Epoch 5/20  Iteration 829/3560 Training loss: 1.7962 2.4495 sec/batch
Epoch 5/20  Iteration 830/3560 Training loss: 1.7959 2.4643 sec/batch
Epoch 5/20  Iteration 831/3560 Training loss: 1.7956 2.4526 sec/batch
Epoch 5/20  Iteration 832/3560 Training loss: 1.7953 2.4699 sec/batch
Epoch 5/20  Iteration 833/3560 Training loss: 1.7951 2.4484 sec/batch
Epoch 5/20  Iteration 834/3560 Training loss: 1.7945 2.4558 sec/batch
Epoch 5/20  Iteration 835/3560 Training loss: 1.7940 2.4546 sec/batch
Epoch 5/20  Iteration 836/3560 Training loss: 1.7938 2.4471 sec/batch
Epoch 5/20  Iteration 837/3560 Training loss: 1.7935 2.4618 sec/batch
Epoch 5/20  Iteration 838/3560 Training loss: 1.7930 2.4518 sec/batch
Epoch 5/20  Iteration 839/3560 Training loss: 1.7929 2.4608 sec/batch
Epoch 5/20  Iteration 840/3560 Training loss: 1.7928 2.4611 sec/batch
Epoch 5/20  Iteration 841/3560 Training loss: 1.7925 2.4574 sec/batch
Epoch 5/20  Iteration 842/3560 Training loss: 1.7921 2.4612 sec/batch
Epoch 5/20  Iteration 843/3560 Training loss: 1.7917 2.4458 sec/batch
Epoch 5/20  Iteration 844/3560 Training loss: 1.7912 2.4791 sec/batch
Epoch 5/20  Iteration 845/3560 Training loss: 1.7911 2.4535 sec/batch
Epoch 5/20  Iteration 846/3560 Training loss: 1.7908 2.4588 sec/batch
Epoch 5/20  Iteration 847/3560 Training loss: 1.7907 2.4641 sec/batch
Epoch 5/20  Iteration 848/3560 Training loss: 1.7904 2.4598 sec/batch
Epoch 5/20  Iteration 849/3560 Training loss: 1.7903 2.4636 sec/batch
Epoch 5/20  Iteration 850/3560 Training loss: 1.7902 2.4528 sec/batch
Epoch 5/20  Iteration 851/3560 Training loss: 1.7902 2.4463 sec/batch
Epoch 5/20  Iteration 852/3560 Training loss: 1.7899 2.4563 sec/batch
Epoch 5/20  Iteration 853/3560 Training loss: 1.7900 2.4506 sec/batch
Epoch 5/20  Iteration 854/3560 Training loss: 1.7897 2.4555 sec/batch
Epoch 5/20  Iteration 855/3560 Training loss: 1.7895 2.4546 sec/batch
Epoch 5/20  Iteration 856/3560 Training loss: 1.7895 2.4684 sec/batch
Epoch 5/20  Iteration 857/3560 Training loss: 1.7892 2.4566 sec/batch
Epoch 5/20  Iteration 858/3560 Training loss: 1.7891 2.4518 sec/batch
Epoch 5/20  Iteration 859/3560 Training loss: 1.7889 2.4598 sec/batch
Epoch 5/20  Iteration 860/3560 Training loss: 1.7889 2.4532 sec/batch
Epoch 5/20  Iteration 861/3560 Training loss: 1.7888 2.4642 sec/batch
Epoch 5/20  Iteration 862/3560 Training loss: 1.7884 2.4501 sec/batch
Epoch 5/20  Iteration 863/3560 Training loss: 1.7880 2.4551 sec/batch
Epoch 5/20  Iteration 864/3560 Training loss: 1.7879 2.4555 sec/batch
Epoch 5/20  Iteration 865/3560 Training loss: 1.7877 2.4508 sec/batch
Epoch 5/20  Iteration 866/3560 Training loss: 1.7876 2.4570 sec/batch
Epoch 5/20  Iteration 867/3560 Training loss: 1.7874 2.4467 sec/batch
Epoch 5/20  Iteration 868/3560 Training loss: 1.7873 2.4751 sec/batch
Epoch 5/20  Iteration 869/3560 Training loss: 1.7871 2.4481 sec/batch
Epoch 5/20  Iteration 870/3560 Training loss: 1.7869 2.4613 sec/batch
Epoch 5/20  Iteration 871/3560 Training loss: 1.7865 2.4640 sec/batch
Epoch 5/20  Iteration 872/3560 Training loss: 1.7864 2.4637 sec/batch
Epoch 5/20  Iteration 873/3560 Training loss: 1.7864 2.4536 sec/batch
Epoch 5/20  Iteration 874/3560 Training loss: 1.7860 2.6414 sec/batch
Epoch 5/20  Iteration 875/3560 Training loss: 1.7859 2.5702 sec/batch
Epoch 5/20  Iteration 876/3560 Training loss: 1.7857 2.4723 sec/batch
Epoch 5/20  Iteration 877/3560 Training loss: 1.7855 2.4726 sec/batch
Epoch 5/20  Iteration 878/3560 Training loss: 1.7853 2.4556 sec/batch
Epoch 5/20  Iteration 879/3560 Training loss: 1.7852 2.4556 sec/batch
Epoch 5/20  Iteration 880/3560 Training loss: 1.7853 2.4743 sec/batch
Epoch 5/20  Iteration 881/3560 Training loss: 1.7850 2.4490 sec/batch
Epoch 5/20  Iteration 882/3560 Training loss: 1.7847 2.4729 sec/batch
Epoch 5/20  Iteration 883/3560 Training loss: 1.7844 2.4458 sec/batch
Epoch 5/20  Iteration 884/3560 Training loss: 1.7841 2.4510 sec/batch
Epoch 5/20  Iteration 885/3560 Training loss: 1.7840 2.4649 sec/batch
Epoch 5/20  Iteration 886/3560 Training loss: 1.7839 2.4580 sec/batch
Epoch 5/20  Iteration 887/3560 Training loss: 1.7837 2.4678 sec/batch
Epoch 5/20  Iteration 888/3560 Training loss: 1.7835 2.4581 sec/batch
Epoch 5/20  Iteration 889/3560 Training loss: 1.7832 2.4448 sec/batch
Epoch 5/20  Iteration 890/3560 Training loss: 1.7830 2.4542 sec/batch
Epoch 6/20  Iteration 891/3560 Training loss: 1.8327 2.4493 sec/batch
Epoch 6/20  Iteration 892/3560 Training loss: 1.7845 2.4655 sec/batch
Epoch 6/20  Iteration 893/3560 Training loss: 1.7691 2.4558 sec/batch
Epoch 6/20  Iteration 894/3560 Training loss: 1.7595 2.4641 sec/batch
Epoch 6/20  Iteration 895/3560 Training loss: 1.7543 2.4576 sec/batch
Epoch 6/20  Iteration 896/3560 Training loss: 1.7432 2.4554 sec/batch
Epoch 6/20  Iteration 897/3560 Training loss: 1.7429 2.4606 sec/batch
Epoch 6/20  Iteration 898/3560 Training loss: 1.7416 2.4486 sec/batch
Epoch 6/20  Iteration 899/3560 Training loss: 1.7433 2.4634 sec/batch
Epoch 6/20  Iteration 900/3560 Training loss: 1.7417 2.4441 sec/batch
Epoch 6/20  Iteration 901/3560 Training loss: 1.7378 2.4483 sec/batch
Epoch 6/20  Iteration 902/3560 Training loss: 1.7361 2.4724 sec/batch
Epoch 6/20  Iteration 903/3560 Training loss: 1.7368 2.4576 sec/batch
Epoch 6/20  Iteration 904/3560 Training loss: 1.7389 2.4574 sec/batch
Epoch 6/20  Iteration 905/3560 Training loss: 1.7377 2.4628 sec/batch
Epoch 6/20  Iteration 906/3560 Training loss: 1.7367 2.4554 sec/batch
Epoch 6/20  Iteration 907/3560 Training loss: 1.7366 2.6883 sec/batch
Epoch 6/20  Iteration 908/3560 Training loss: 1.7382 2.4529 sec/batch
Epoch 6/20  Iteration 909/3560 Training loss: 1.7384 2.4649 sec/batch
Epoch 6/20  Iteration 910/3560 Training loss: 1.7386 2.4472 sec/batch
Epoch 6/20  Iteration 911/3560 Training loss: 1.7379 2.4579 sec/batch
Epoch 6/20  Iteration 912/3560 Training loss: 1.7389 2.4535 sec/batch
Epoch 6/20  Iteration 913/3560 Training loss: 1.7381 2.4431 sec/batch
Epoch 6/20  Iteration 914/3560 Training loss: 1.7375 2.4530 sec/batch
Epoch 6/20  Iteration 915/3560 Training loss: 1.7376 2.4536 sec/batch
Epoch 6/20  Iteration 916/3560 Training loss: 1.7358 2.4780 sec/batch
Epoch 6/20  Iteration 917/3560 Training loss: 1.7341 2.4599 sec/batch
Epoch 6/20  Iteration 918/3560 Training loss: 1.7342 2.4549 sec/batch
Epoch 6/20  Iteration 919/3560 Training loss: 1.7349 2.4843 sec/batch
Epoch 6/20  Iteration 920/3560 Training loss: 1.7350 2.4659 sec/batch
Epoch 6/20  Iteration 921/3560 Training loss: 1.7344 2.4625 sec/batch
Epoch 6/20  Iteration 922/3560 Training loss: 1.7330 2.4511 sec/batch
Epoch 6/20  Iteration 923/3560 Training loss: 1.7329 2.6581 sec/batch
Epoch 6/20  Iteration 924/3560 Training loss: 1.7332 2.4682 sec/batch
Epoch 6/20  Iteration 925/3560 Training loss: 1.7328 2.4614 sec/batch
Epoch 6/20  Iteration 926/3560 Training loss: 1.7324 2.4720 sec/batch
Epoch 6/20  Iteration 927/3560 Training loss: 1.7317 2.4582 sec/batch
Epoch 6/20  Iteration 928/3560 Training loss: 1.7303 2.4501 sec/batch
Epoch 6/20  Iteration 929/3560 Training loss: 1.7288 2.4699 sec/batch
Epoch 6/20  Iteration 930/3560 Training loss: 1.7282 2.4508 sec/batch
Epoch 6/20  Iteration 931/3560 Training loss: 1.7275 2.8887 sec/batch
Epoch 6/20  Iteration 932/3560 Training loss: 1.7277 2.4557 sec/batch
Epoch 6/20  Iteration 933/3560 Training loss: 1.7270 2.4491 sec/batch
Epoch 6/20  Iteration 934/3560 Training loss: 1.7260 2.4527 sec/batch
Epoch 6/20  Iteration 935/3560 Training loss: 1.7261 2.4634 sec/batch
Epoch 6/20  Iteration 936/3560 Training loss: 1.7248 2.4504 sec/batch
Epoch 6/20  Iteration 937/3560 Training loss: 1.7243 2.4750 sec/batch
Epoch 6/20  Iteration 938/3560 Training loss: 1.7237 2.4446 sec/batch
Epoch 6/20  Iteration 939/3560 Training loss: 1.7232 2.4791 sec/batch
Epoch 6/20  Iteration 940/3560 Training loss: 1.7239 2.4564 sec/batch
Epoch 6/20  Iteration 941/3560 Training loss: 1.7232 2.4742 sec/batch
Epoch 6/20  Iteration 942/3560 Training loss: 1.7239 2.4526 sec/batch
Epoch 6/20  Iteration 943/3560 Training loss: 1.7237 2.4561 sec/batch
Epoch 6/20  Iteration 944/3560 Training loss: 1.7237 2.4670 sec/batch
Epoch 6/20  Iteration 945/3560 Training loss: 1.7232 2.4504 sec/batch
Epoch 6/20  Iteration 946/3560 Training loss: 1.7233 2.4516 sec/batch
Epoch 6/20  Iteration 947/3560 Training loss: 1.7233 2.4622 sec/batch
Epoch 6/20  Iteration 948/3560 Training loss: 1.7227 2.4486 sec/batch
Epoch 6/20  Iteration 949/3560 Training loss: 1.7220 2.4703 sec/batch
Epoch 6/20  Iteration 950/3560 Training loss: 1.7223 2.4523 sec/batch
Epoch 6/20  Iteration 951/3560 Training loss: 1.7222 2.4481 sec/batch
Epoch 6/20  Iteration 952/3560 Training loss: 1.7227 2.4541 sec/batch
Epoch 6/20  Iteration 953/3560 Training loss: 1.7230 2.4691 sec/batch
Epoch 6/20  Iteration 954/3560 Training loss: 1.7231 2.4572 sec/batch
Epoch 6/20  Iteration 955/3560 Training loss: 1.7228 2.4459 sec/batch
Epoch 6/20  Iteration 956/3560 Training loss: 1.7229 2.4611 sec/batch
Epoch 6/20  Iteration 957/3560 Training loss: 1.7229 2.4500 sec/batch
Epoch 6/20  Iteration 958/3560 Training loss: 1.7223 2.4659 sec/batch
Epoch 6/20  Iteration 959/3560 Training loss: 1.7220 2.4440 sec/batch
Epoch 6/20  Iteration 960/3560 Training loss: 1.7216 2.4460 sec/batch
Epoch 6/20  Iteration 961/3560 Training loss: 1.7220 2.4523 sec/batch
Epoch 6/20  Iteration 962/3560 Training loss: 1.7220 2.4489 sec/batch
Epoch 6/20  Iteration 963/3560 Training loss: 1.7223 2.4618 sec/batch
Epoch 6/20  Iteration 964/3560 Training loss: 1.7218 2.4501 sec/batch
Epoch 6/20  Iteration 965/3560 Training loss: 1.7215 2.4601 sec/batch
Epoch 6/20  Iteration 966/3560 Training loss: 1.7217 2.4572 sec/batch
Epoch 6/20  Iteration 967/3560 Training loss: 1.7214 2.4552 sec/batch
Epoch 6/20  Iteration 968/3560 Training loss: 1.7213 2.4622 sec/batch
Epoch 6/20  Iteration 969/3560 Training loss: 1.7206 2.4546 sec/batch
Epoch 6/20  Iteration 970/3560 Training loss: 1.7204 2.4686 sec/batch
Epoch 6/20  Iteration 971/3560 Training loss: 1.7197 2.6488 sec/batch
Epoch 6/20  Iteration 972/3560 Training loss: 1.7197 2.5295 sec/batch
Epoch 6/20  Iteration 973/3560 Training loss: 1.7190 2.4539 sec/batch
Epoch 6/20  Iteration 974/3560 Training loss: 1.7189 2.4517 sec/batch
Epoch 6/20  Iteration 975/3560 Training loss: 1.7183 2.4591 sec/batch
Epoch 6/20  Iteration 976/3560 Training loss: 1.7177 2.4535 sec/batch
Epoch 6/20  Iteration 977/3560 Training loss: 1.7174 2.4790 sec/batch
Epoch 6/20  Iteration 978/3560 Training loss: 1.7169 2.4438 sec/batch
Epoch 6/20  Iteration 979/3560 Training loss: 1.7163 2.4523 sec/batch
Epoch 6/20  Iteration 980/3560 Training loss: 1.7163 2.4638 sec/batch
Epoch 6/20  Iteration 981/3560 Training loss: 1.7158 2.4568 sec/batch
Epoch 6/20  Iteration 982/3560 Training loss: 1.7155 2.4651 sec/batch
Epoch 6/20  Iteration 983/3560 Training loss: 1.7149 2.4443 sec/batch
Epoch 6/20  Iteration 984/3560 Training loss: 1.7145 2.4598 sec/batch
Epoch 6/20  Iteration 985/3560 Training loss: 1.7139 2.4561 sec/batch
Epoch 6/20  Iteration 986/3560 Training loss: 1.7136 2.4412 sec/batch
Epoch 6/20  Iteration 987/3560 Training loss: 1.7133 2.4680 sec/batch
Epoch 6/20  Iteration 988/3560 Training loss: 1.7127 2.4526 sec/batch
Epoch 6/20  Iteration 989/3560 Training loss: 1.7123 2.4655 sec/batch
Epoch 6/20  Iteration 990/3560 Training loss: 1.7115 2.4654 sec/batch
Epoch 6/20  Iteration 991/3560 Training loss: 1.7113 2.4445 sec/batch
Epoch 6/20  Iteration 992/3560 Training loss: 1.7109 2.4651 sec/batch
Epoch 6/20  Iteration 993/3560 Training loss: 1.7105 2.4523 sec/batch
Epoch 6/20  Iteration 994/3560 Training loss: 1.7101 2.4619 sec/batch
Epoch 6/20  Iteration 995/3560 Training loss: 1.7097 2.4476 sec/batch
Epoch 6/20  Iteration 996/3560 Training loss: 1.7094 2.4586 sec/batch
Epoch 6/20  Iteration 997/3560 Training loss: 1.7092 2.4688 sec/batch
Epoch 6/20  Iteration 998/3560 Training loss: 1.7091 2.4431 sec/batch
Epoch 6/20  Iteration 999/3560 Training loss: 1.7089 2.4633 sec/batch
Epoch 6/20  Iteration 1000/3560 Training loss: 1.7088 2.4488 sec/batch
Epoch 6/20  Iteration 1001/3560 Training loss: 1.7086 2.4557 sec/batch
Epoch 6/20  Iteration 1002/3560 Training loss: 1.7084 2.4676 sec/batch
Epoch 6/20  Iteration 1003/3560 Training loss: 1.7081 2.4699 sec/batch
Epoch 6/20  Iteration 1004/3560 Training loss: 1.7079 2.4586 sec/batch
Epoch 6/20  Iteration 1005/3560 Training loss: 1.7076 2.4460 sec/batch
Epoch 6/20  Iteration 1006/3560 Training loss: 1.7071 2.4575 sec/batch
Epoch 6/20  Iteration 1007/3560 Training loss: 1.7069 2.4540 sec/batch
Epoch 6/20  Iteration 1008/3560 Training loss: 1.7068 2.4614 sec/batch
Epoch 6/20  Iteration 1009/3560 Training loss: 1.7065 2.4528 sec/batch
Epoch 6/20  Iteration 1010/3560 Training loss: 1.7063 2.4495 sec/batch
Epoch 6/20  Iteration 1011/3560 Training loss: 1.7062 2.4574 sec/batch
Epoch 6/20  Iteration 1012/3560 Training loss: 1.7058 2.4464 sec/batch
Epoch 6/20  Iteration 1013/3560 Training loss: 1.7054 2.4570 sec/batch
Epoch 6/20  Iteration 1014/3560 Training loss: 1.7052 2.4642 sec/batch
Epoch 6/20  Iteration 1015/3560 Training loss: 1.7051 2.4593 sec/batch
Epoch 6/20  Iteration 1016/3560 Training loss: 1.7046 2.4598 sec/batch
Epoch 6/20  Iteration 1017/3560 Training loss: 1.7046 2.4505 sec/batch
Epoch 6/20  Iteration 1018/3560 Training loss: 1.7047 2.4553 sec/batch
Epoch 6/20  Iteration 1019/3560 Training loss: 1.7044 2.4483 sec/batch
Epoch 6/20  Iteration 1020/3560 Training loss: 1.7041 2.6411 sec/batch
Epoch 6/20  Iteration 1021/3560 Training loss: 1.7036 2.5182 sec/batch
Epoch 6/20  Iteration 1022/3560 Training loss: 1.7032 2.4578 sec/batch
Epoch 6/20  Iteration 1023/3560 Training loss: 1.7031 2.4501 sec/batch
Epoch 6/20  Iteration 1024/3560 Training loss: 1.7030 2.4693 sec/batch
Epoch 6/20  Iteration 1025/3560 Training loss: 1.7029 2.4472 sec/batch
Epoch 6/20  Iteration 1026/3560 Training loss: 1.7028 2.4579 sec/batch
Epoch 6/20  Iteration 1027/3560 Training loss: 1.7028 2.4563 sec/batch
Epoch 6/20  Iteration 1028/3560 Training loss: 1.7028 2.4413 sec/batch
Epoch 6/20  Iteration 1029/3560 Training loss: 1.7028 2.4799 sec/batch
Epoch 6/20  Iteration 1030/3560 Training loss: 1.7025 2.4544 sec/batch
Epoch 6/20  Iteration 1031/3560 Training loss: 1.7028 2.4564 sec/batch
Epoch 6/20  Iteration 1032/3560 Training loss: 1.7026 2.4529 sec/batch
Epoch 6/20  Iteration 1033/3560 Training loss: 1.7024 2.4542 sec/batch
Epoch 6/20  Iteration 1034/3560 Training loss: 1.7024 2.4667 sec/batch
Epoch 6/20  Iteration 1035/3560 Training loss: 1.7022 2.4438 sec/batch
Epoch 6/20  Iteration 1036/3560 Training loss: 1.7022 2.4604 sec/batch
Epoch 6/20  Iteration 1037/3560 Training loss: 1.7021 2.4640 sec/batch
Epoch 6/20  Iteration 1038/3560 Training loss: 1.7021 2.4688 sec/batch
Epoch 6/20  Iteration 1039/3560 Training loss: 1.7020 2.4653 sec/batch
Epoch 6/20  Iteration 1040/3560 Training loss: 1.7018 2.4550 sec/batch
Epoch 6/20  Iteration 1041/3560 Training loss: 1.7013 2.4555 sec/batch
Epoch 6/20  Iteration 1042/3560 Training loss: 1.7012 2.4489 sec/batch
Epoch 6/20  Iteration 1043/3560 Training loss: 1.7012 2.4571 sec/batch
Epoch 6/20  Iteration 1044/3560 Training loss: 1.7011 2.4520 sec/batch
Epoch 6/20  Iteration 1045/3560 Training loss: 1.7009 2.4506 sec/batch
Epoch 6/20  Iteration 1046/3560 Training loss: 1.7008 2.4634 sec/batch
Epoch 6/20  Iteration 1047/3560 Training loss: 1.7007 2.4371 sec/batch
Epoch 6/20  Iteration 1048/3560 Training loss: 1.7005 2.4532 sec/batch
Epoch 6/20  Iteration 1049/3560 Training loss: 1.7001 2.4440 sec/batch
Epoch 6/20  Iteration 1050/3560 Training loss: 1.7000 2.4709 sec/batch
Epoch 6/20  Iteration 1051/3560 Training loss: 1.7000 2.4535 sec/batch
Epoch 6/20  Iteration 1052/3560 Training loss: 1.6998 2.6793 sec/batch
Epoch 6/20  Iteration 1053/3560 Training loss: 1.6997 2.4836 sec/batch
Epoch 6/20  Iteration 1054/3560 Training loss: 1.6996 2.4599 sec/batch
Epoch 6/20  Iteration 1055/3560 Training loss: 1.6994 2.4588 sec/batch
Epoch 6/20  Iteration 1056/3560 Training loss: 1.6992 2.4542 sec/batch
Epoch 6/20  Iteration 1057/3560 Training loss: 1.6992 2.4487 sec/batch
Epoch 6/20  Iteration 1058/3560 Training loss: 1.6994 2.4575 sec/batch
Epoch 6/20  Iteration 1059/3560 Training loss: 1.6992 2.4544 sec/batch
Epoch 6/20  Iteration 1060/3560 Training loss: 1.6990 2.4628 sec/batch
Epoch 6/20  Iteration 1061/3560 Training loss: 1.6988 2.4488 sec/batch
Epoch 6/20  Iteration 1062/3560 Training loss: 1.6985 2.4650 sec/batch
Epoch 6/20  Iteration 1063/3560 Training loss: 1.6984 2.4574 sec/batch
Epoch 6/20  Iteration 1064/3560 Training loss: 1.6983 2.4470 sec/batch
Epoch 6/20  Iteration 1065/3560 Training loss: 1.6982 2.4800 sec/batch
Epoch 6/20  Iteration 1066/3560 Training loss: 1.6980 2.4761 sec/batch
Epoch 6/20  Iteration 1067/3560 Training loss: 1.6977 2.4544 sec/batch
Epoch 6/20  Iteration 1068/3560 Training loss: 1.6976 2.5858 sec/batch
Epoch 7/20  Iteration 1069/3560 Training loss: 1.7531 2.5112 sec/batch
Epoch 7/20  Iteration 1070/3560 Training loss: 1.7155 2.5016 sec/batch
Epoch 7/20  Iteration 1071/3560 Training loss: 1.6970 2.4486 sec/batch
Epoch 7/20  Iteration 1072/3560 Training loss: 1.6885 2.4704 sec/batch
Epoch 7/20  Iteration 1073/3560 Training loss: 1.6824 2.4498 sec/batch
Epoch 7/20  Iteration 1074/3560 Training loss: 1.6716 2.4549 sec/batch
Epoch 7/20  Iteration 1075/3560 Training loss: 1.6709 2.4550 sec/batch
Epoch 7/20  Iteration 1076/3560 Training loss: 1.6686 2.4639 sec/batch
Epoch 7/20  Iteration 1077/3560 Training loss: 1.6690 2.4596 sec/batch
Epoch 7/20  Iteration 1078/3560 Training loss: 1.6682 2.4521 sec/batch
Epoch 7/20  Iteration 1079/3560 Training loss: 1.6643 2.4413 sec/batch
Epoch 7/20  Iteration 1080/3560 Training loss: 1.6625 2.4622 sec/batch
Epoch 7/20  Iteration 1081/3560 Training loss: 1.6624 2.4608 sec/batch
Epoch 7/20  Iteration 1082/3560 Training loss: 1.6646 2.4554 sec/batch
Epoch 7/20  Iteration 1083/3560 Training loss: 1.6635 2.4495 sec/batch
Epoch 7/20  Iteration 1084/3560 Training loss: 1.6616 2.4539 sec/batch
Epoch 7/20  Iteration 1085/3560 Training loss: 1.6616 2.4541 sec/batch
Epoch 7/20  Iteration 1086/3560 Training loss: 1.6635 2.4429 sec/batch
Epoch 7/20  Iteration 1087/3560 Training loss: 1.6639 2.4732 sec/batch
Epoch 7/20  Iteration 1088/3560 Training loss: 1.6647 2.4449 sec/batch
Epoch 7/20  Iteration 1089/3560 Training loss: 1.6636 2.4581 sec/batch
Epoch 7/20  Iteration 1090/3560 Training loss: 1.6642 2.4550 sec/batch
Epoch 7/20  Iteration 1091/3560 Training loss: 1.6632 2.4465 sec/batch
Epoch 7/20  Iteration 1092/3560 Training loss: 1.6628 2.4556 sec/batch
Epoch 7/20  Iteration 1093/3560 Training loss: 1.6626 2.4512 sec/batch
Epoch 7/20  Iteration 1094/3560 Training loss: 1.6608 2.4613 sec/batch
Epoch 7/20  Iteration 1095/3560 Training loss: 1.6595 2.4463 sec/batch
Epoch 7/20  Iteration 1096/3560 Training loss: 1.6596 2.4565 sec/batch
Epoch 7/20  Iteration 1097/3560 Training loss: 1.6601 2.4659 sec/batch
Epoch 7/20  Iteration 1098/3560 Training loss: 1.6602 2.4450 sec/batch
Epoch 7/20  Iteration 1099/3560 Training loss: 1.6598 2.4712 sec/batch
Epoch 7/20  Iteration 1100/3560 Training loss: 1.6586 2.4569 sec/batch
Epoch 7/20  Iteration 1101/3560 Training loss: 1.6586 2.4586 sec/batch
Epoch 7/20  Iteration 1102/3560 Training loss: 1.6593 2.4506 sec/batch
Epoch 7/20  Iteration 1103/3560 Training loss: 1.6590 2.4551 sec/batch
Epoch 7/20  Iteration 1104/3560 Training loss: 1.6584 2.4546 sec/batch
Epoch 7/20  Iteration 1105/3560 Training loss: 1.6576 2.4511 sec/batch
Epoch 7/20  Iteration 1106/3560 Training loss: 1.6562 2.4526 sec/batch
Epoch 7/20  Iteration 1107/3560 Training loss: 1.6549 2.4552 sec/batch
Epoch 7/20  Iteration 1108/3560 Training loss: 1.6542 2.4679 sec/batch
Epoch 7/20  Iteration 1109/3560 Training loss: 1.6534 2.4551 sec/batch
Epoch 7/20  Iteration 1110/3560 Training loss: 1.6540 2.4475 sec/batch
Epoch 7/20  Iteration 1111/3560 Training loss: 1.6534 2.4867 sec/batch
Epoch 7/20  Iteration 1112/3560 Training loss: 1.6527 2.4459 sec/batch
Epoch 7/20  Iteration 1113/3560 Training loss: 1.6529 2.4653 sec/batch
Epoch 7/20  Iteration 1114/3560 Training loss: 1.6517 2.4625 sec/batch
Epoch 7/20  Iteration 1115/3560 Training loss: 1.6511 2.4468 sec/batch
Epoch 7/20  Iteration 1116/3560 Training loss: 1.6505 2.4565 sec/batch
Epoch 7/20  Iteration 1117/3560 Training loss: 1.6499 2.5042 sec/batch
Epoch 7/20  Iteration 1118/3560 Training loss: 1.6505 2.5995 sec/batch
Epoch 7/20  Iteration 1119/3560 Training loss: 1.6498 2.4944 sec/batch
Epoch 7/20  Iteration 1120/3560 Training loss: 1.6507 2.4517 sec/batch
Epoch 7/20  Iteration 1121/3560 Training loss: 1.6505 2.4630 sec/batch
Epoch 7/20  Iteration 1122/3560 Training loss: 1.6504 2.4505 sec/batch
Epoch 7/20  Iteration 1123/3560 Training loss: 1.6500 2.4652 sec/batch
Epoch 7/20  Iteration 1124/3560 Training loss: 1.6501 2.4516 sec/batch
Epoch 7/20  Iteration 1125/3560 Training loss: 1.6503 2.4490 sec/batch
Epoch 7/20  Iteration 1126/3560 Training loss: 1.6497 2.4624 sec/batch
Epoch 7/20  Iteration 1127/3560 Training loss: 1.6489 2.4535 sec/batch
Epoch 7/20  Iteration 1128/3560 Training loss: 1.6495 2.4619 sec/batch
Epoch 7/20  Iteration 1129/3560 Training loss: 1.6493 2.4515 sec/batch
Epoch 7/20  Iteration 1130/3560 Training loss: 1.6500 2.4679 sec/batch
Epoch 7/20  Iteration 1131/3560 Training loss: 1.6503 2.4493 sec/batch
Epoch 7/20  Iteration 1132/3560 Training loss: 1.6505 2.4546 sec/batch
Epoch 7/20  Iteration 1133/3560 Training loss: 1.6501 2.4482 sec/batch
Epoch 7/20  Iteration 1134/3560 Training loss: 1.6501 2.4574 sec/batch
Epoch 7/20  Iteration 1135/3560 Training loss: 1.6501 2.4747 sec/batch
Epoch 7/20  Iteration 1136/3560 Training loss: 1.6497 2.4569 sec/batch
Epoch 7/20  Iteration 1137/3560 Training loss: 1.6495 2.4509 sec/batch
Epoch 7/20  Iteration 1138/3560 Training loss: 1.6492 2.4618 sec/batch
Epoch 7/20  Iteration 1139/3560 Training loss: 1.6496 2.4617 sec/batch
Epoch 7/20  Iteration 1140/3560 Training loss: 1.6496 2.4456 sec/batch
Epoch 7/20  Iteration 1141/3560 Training loss: 1.6500 2.4472 sec/batch
Epoch 7/20  Iteration 1142/3560 Training loss: 1.6496 2.4839 sec/batch
Epoch 7/20  Iteration 1143/3560 Training loss: 1.6494 2.4601 sec/batch
Epoch 7/20  Iteration 1144/3560 Training loss: 1.6496 2.4514 sec/batch
Epoch 7/20  Iteration 1145/3560 Training loss: 1.6494 2.4539 sec/batch
Epoch 7/20  Iteration 1146/3560 Training loss: 1.6493 2.4515 sec/batch
Epoch 7/20  Iteration 1147/3560 Training loss: 1.6486 2.4726 sec/batch
Epoch 7/20  Iteration 1148/3560 Training loss: 1.6484 2.4512 sec/batch
Epoch 7/20  Iteration 1149/3560 Training loss: 1.6478 2.4586 sec/batch
Epoch 7/20  Iteration 1150/3560 Training loss: 1.6478 2.4478 sec/batch
Epoch 7/20  Iteration 1151/3560 Training loss: 1.6472 2.4584 sec/batch
Epoch 7/20  Iteration 1152/3560 Training loss: 1.6471 2.4555 sec/batch
Epoch 7/20  Iteration 1153/3560 Training loss: 1.6467 2.4489 sec/batch
Epoch 7/20  Iteration 1154/3560 Training loss: 1.6463 2.4602 sec/batch
Epoch 7/20  Iteration 1155/3560 Training loss: 1.6459 2.4538 sec/batch
Epoch 7/20  Iteration 1156/3560 Training loss: 1.6455 2.4616 sec/batch
Epoch 7/20  Iteration 1157/3560 Training loss: 1.6449 2.4539 sec/batch
Epoch 7/20  Iteration 1158/3560 Training loss: 1.6449 2.4566 sec/batch
Epoch 7/20  Iteration 1159/3560 Training loss: 1.6446 2.4512 sec/batch
Epoch 7/20  Iteration 1160/3560 Training loss: 1.6442 2.4627 sec/batch
Epoch 7/20  Iteration 1161/3560 Training loss: 1.6437 2.4598 sec/batch
Epoch 7/20  Iteration 1162/3560 Training loss: 1.6433 2.4722 sec/batch
Epoch 7/20  Iteration 1163/3560 Training loss: 1.6429 2.4686 sec/batch
Epoch 7/20  Iteration 1164/3560 Training loss: 1.6427 2.4633 sec/batch
Epoch 7/20  Iteration 1165/3560 Training loss: 1.6425 2.4439 sec/batch
Epoch 7/20  Iteration 1166/3560 Training loss: 1.6420 2.7023 sec/batch
Epoch 7/20  Iteration 1167/3560 Training loss: 1.6415 2.4567 sec/batch
Epoch 7/20  Iteration 1168/3560 Training loss: 1.6409 2.4594 sec/batch
Epoch 7/20  Iteration 1169/3560 Training loss: 1.6408 2.4453 sec/batch
Epoch 7/20  Iteration 1170/3560 Training loss: 1.6406 2.4571 sec/batch
Epoch 7/20  Iteration 1171/3560 Training loss: 1.6404 2.4527 sec/batch
Epoch 7/20  Iteration 1172/3560 Training loss: 1.6402 2.4707 sec/batch
Epoch 7/20  Iteration 1173/3560 Training loss: 1.6399 2.4631 sec/batch
Epoch 7/20  Iteration 1174/3560 Training loss: 1.6397 2.7756 sec/batch
Epoch 7/20  Iteration 1175/3560 Training loss: 1.6396 2.4629 sec/batch
Epoch 7/20  Iteration 1176/3560 Training loss: 1.6395 2.4540 sec/batch
Epoch 7/20  Iteration 1177/3560 Training loss: 1.6394 2.4543 sec/batch
Epoch 7/20  Iteration 1178/3560 Training loss: 1.6392 2.4633 sec/batch
Epoch 7/20  Iteration 1179/3560 Training loss: 1.6390 2.4476 sec/batch
Epoch 7/20  Iteration 1180/3560 Training loss: 1.6389 2.4695 sec/batch
Epoch 7/20  Iteration 1181/3560 Training loss: 1.6386 2.4501 sec/batch
Epoch 7/20  Iteration 1182/3560 Training loss: 1.6384 2.4780 sec/batch
Epoch 7/20  Iteration 1183/3560 Training loss: 1.6380 2.4752 sec/batch
Epoch 7/20  Iteration 1184/3560 Training loss: 1.6376 2.4584 sec/batch
Epoch 7/20  Iteration 1185/3560 Training loss: 1.6376 2.4602 sec/batch
Epoch 7/20  Iteration 1186/3560 Training loss: 1.6373 2.4521 sec/batch
Epoch 7/20  Iteration 1187/3560 Training loss: 1.6371 2.4688 sec/batch
Epoch 7/20  Iteration 1188/3560 Training loss: 1.6369 2.4516 sec/batch
Epoch 7/20  Iteration 1189/3560 Training loss: 1.6367 2.4533 sec/batch
Epoch 7/20  Iteration 1190/3560 Training loss: 1.6361 2.4614 sec/batch
Epoch 7/20  Iteration 1191/3560 Training loss: 1.6356 2.4520 sec/batch
Epoch 7/20  Iteration 1192/3560 Training loss: 1.6356 2.4632 sec/batch
Epoch 7/20  Iteration 1193/3560 Training loss: 1.6355 2.4497 sec/batch
Epoch 7/20  Iteration 1194/3560 Training loss: 1.6350 2.4505 sec/batch
Epoch 7/20  Iteration 1195/3560 Training loss: 1.6350 2.4491 sec/batch
Epoch 7/20  Iteration 1196/3560 Training loss: 1.6350 2.4618 sec/batch
Epoch 7/20  Iteration 1197/3560 Training loss: 1.6347 2.4673 sec/batch
Epoch 7/20  Iteration 1198/3560 Training loss: 1.6343 2.4543 sec/batch
Epoch 7/20  Iteration 1199/3560 Training loss: 1.6339 2.4567 sec/batch
Epoch 7/20  Iteration 1200/3560 Training loss: 1.6334 2.4459 sec/batch
Epoch 7/20  Iteration 1201/3560 Training loss: 1.6334 2.4648 sec/batch
Epoch 7/20  Iteration 1202/3560 Training loss: 1.6333 2.4495 sec/batch
Epoch 7/20  Iteration 1203/3560 Training loss: 1.6331 2.4494 sec/batch
Epoch 7/20  Iteration 1204/3560 Training loss: 1.6331 2.4610 sec/batch
Epoch 7/20  Iteration 1205/3560 Training loss: 1.6331 2.4509 sec/batch
Epoch 7/20  Iteration 1206/3560 Training loss: 1.6331 2.4632 sec/batch
Epoch 7/20  Iteration 1207/3560 Training loss: 1.6330 2.4512 sec/batch
Epoch 7/20  Iteration 1208/3560 Training loss: 1.6328 2.4794 sec/batch
Epoch 7/20  Iteration 1209/3560 Training loss: 1.6330 2.4647 sec/batch
Epoch 7/20  Iteration 1210/3560 Training loss: 1.6328 2.4424 sec/batch
Epoch 7/20  Iteration 1211/3560 Training loss: 1.6327 2.4660 sec/batch
Epoch 7/20  Iteration 1212/3560 Training loss: 1.6327 2.4432 sec/batch
Epoch 7/20  Iteration 1213/3560 Training loss: 1.6325 2.4527 sec/batch
Epoch 7/20  Iteration 1214/3560 Training loss: 1.6325 2.4642 sec/batch
Epoch 7/20  Iteration 1215/3560 Training loss: 1.6324 2.6058 sec/batch
Epoch 7/20  Iteration 1216/3560 Training loss: 1.6325 2.5322 sec/batch
Epoch 7/20  Iteration 1217/3560 Training loss: 1.6325 2.4530 sec/batch
Epoch 7/20  Iteration 1218/3560 Training loss: 1.6323 2.4572 sec/batch
Epoch 7/20  Iteration 1219/3560 Training loss: 1.6318 2.4506 sec/batch
Epoch 7/20  Iteration 1220/3560 Training loss: 1.6317 2.4668 sec/batch
Epoch 7/20  Iteration 1221/3560 Training loss: 1.6317 2.4515 sec/batch
Epoch 7/20  Iteration 1222/3560 Training loss: 1.6317 2.4567 sec/batch
Epoch 7/20  Iteration 1223/3560 Training loss: 1.6315 2.4609 sec/batch
Epoch 7/20  Iteration 1224/3560 Training loss: 1.6314 2.4563 sec/batch
Epoch 7/20  Iteration 1225/3560 Training loss: 1.6314 2.4452 sec/batch
Epoch 7/20  Iteration 1226/3560 Training loss: 1.6312 2.4624 sec/batch
Epoch 7/20  Iteration 1227/3560 Training loss: 1.6309 2.4553 sec/batch
Epoch 7/20  Iteration 1228/3560 Training loss: 1.6309 2.4642 sec/batch
Epoch 7/20  Iteration 1229/3560 Training loss: 1.6310 2.4539 sec/batch
Epoch 7/20  Iteration 1230/3560 Training loss: 1.6308 2.4447 sec/batch
Epoch 7/20  Iteration 1231/3560 Training loss: 1.6307 2.4589 sec/batch
Epoch 7/20  Iteration 1232/3560 Training loss: 1.6306 2.4709 sec/batch
Epoch 7/20  Iteration 1233/3560 Training loss: 1.6304 2.4530 sec/batch
Epoch 7/20  Iteration 1234/3560 Training loss: 1.6302 2.4468 sec/batch
Epoch 7/20  Iteration 1235/3560 Training loss: 1.6302 2.4560 sec/batch
Epoch 7/20  Iteration 1236/3560 Training loss: 1.6305 2.4700 sec/batch
Epoch 7/20  Iteration 1237/3560 Training loss: 1.6303 2.4425 sec/batch
Epoch 7/20  Iteration 1238/3560 Training loss: 1.6302 2.4615 sec/batch
Epoch 7/20  Iteration 1239/3560 Training loss: 1.6301 2.4423 sec/batch
Epoch 7/20  Iteration 1240/3560 Training loss: 1.6298 2.4722 sec/batch
Epoch 7/20  Iteration 1241/3560 Training loss: 1.6298 2.4527 sec/batch
Epoch 7/20  Iteration 1242/3560 Training loss: 1.6297 2.4591 sec/batch
Epoch 7/20  Iteration 1243/3560 Training loss: 1.6297 2.4533 sec/batch
Epoch 7/20  Iteration 1244/3560 Training loss: 1.6294 2.4537 sec/batch
Epoch 7/20  Iteration 1245/3560 Training loss: 1.6292 2.4675 sec/batch
Epoch 7/20  Iteration 1246/3560 Training loss: 1.6292 2.4511 sec/batch
Epoch 8/20  Iteration 1247/3560 Training loss: 1.6989 2.4425 sec/batch
Epoch 8/20  Iteration 1248/3560 Training loss: 1.6539 2.4528 sec/batch
Epoch 8/20  Iteration 1249/3560 Training loss: 1.6353 2.4470 sec/batch
Epoch 8/20  Iteration 1250/3560 Training loss: 1.6281 2.4678 sec/batch
Epoch 8/20  Iteration 1251/3560 Training loss: 1.6199 2.4481 sec/batch
Epoch 8/20  Iteration 1252/3560 Training loss: 1.6097 2.4613 sec/batch
Epoch 8/20  Iteration 1253/3560 Training loss: 1.6089 2.4506 sec/batch
Epoch 8/20  Iteration 1254/3560 Training loss: 1.6072 2.4542 sec/batch
Epoch 8/20  Iteration 1255/3560 Training loss: 1.6078 2.4609 sec/batch
Epoch 8/20  Iteration 1256/3560 Training loss: 1.6066 2.4573 sec/batch
Epoch 8/20  Iteration 1257/3560 Training loss: 1.6028 2.4813 sec/batch
Epoch 8/20  Iteration 1258/3560 Training loss: 1.6021 2.4467 sec/batch
Epoch 8/20  Iteration 1259/3560 Training loss: 1.6018 2.4586 sec/batch
Epoch 8/20  Iteration 1260/3560 Training loss: 1.6038 2.4526 sec/batch
Epoch 8/20  Iteration 1261/3560 Training loss: 1.6021 2.4456 sec/batch
Epoch 8/20  Iteration 1262/3560 Training loss: 1.5999 2.4605 sec/batch
Epoch 8/20  Iteration 1263/3560 Training loss: 1.5999 2.6400 sec/batch
Epoch 8/20  Iteration 1264/3560 Training loss: 1.6019 2.5155 sec/batch
Epoch 8/20  Iteration 1265/3560 Training loss: 1.6018 2.4567 sec/batch
Epoch 8/20  Iteration 1266/3560 Training loss: 1.6031 2.4488 sec/batch
Epoch 8/20  Iteration 1267/3560 Training loss: 1.6022 2.4679 sec/batch
Epoch 8/20  Iteration 1268/3560 Training loss: 1.6026 2.4596 sec/batch
Epoch 8/20  Iteration 1269/3560 Training loss: 1.6017 2.4718 sec/batch
Epoch 8/20  Iteration 1270/3560 Training loss: 1.6012 2.4507 sec/batch
Epoch 8/20  Iteration 1271/3560 Training loss: 1.6014 2.4639 sec/batch
Epoch 8/20  Iteration 1272/3560 Training loss: 1.5994 2.4462 sec/batch
Epoch 8/20  Iteration 1273/3560 Training loss: 1.5977 2.4666 sec/batch
Epoch 8/20  Iteration 1274/3560 Training loss: 1.5980 2.4465 sec/batch
Epoch 8/20  Iteration 1275/3560 Training loss: 1.5987 2.4630 sec/batch
Epoch 8/20  Iteration 1276/3560 Training loss: 1.5988 2.4506 sec/batch
Epoch 8/20  Iteration 1277/3560 Training loss: 1.5985 2.4552 sec/batch
Epoch 8/20  Iteration 1278/3560 Training loss: 1.5975 2.4633 sec/batch
Epoch 8/20  Iteration 1279/3560 Training loss: 1.5978 2.4540 sec/batch
Epoch 8/20  Iteration 1280/3560 Training loss: 1.5984 2.4605 sec/batch
Epoch 8/20  Iteration 1281/3560 Training loss: 1.5981 2.4668 sec/batch
Epoch 8/20  Iteration 1282/3560 Training loss: 1.5978 2.4467 sec/batch
Epoch 8/20  Iteration 1283/3560 Training loss: 1.5970 2.4591 sec/batch
Epoch 8/20  Iteration 1284/3560 Training loss: 1.5955 2.4593 sec/batch
Epoch 8/20  Iteration 1285/3560 Training loss: 1.5941 2.4588 sec/batch
Epoch 8/20  Iteration 1286/3560 Training loss: 1.5935 2.4468 sec/batch
Epoch 8/20  Iteration 1287/3560 Training loss: 1.5930 2.4771 sec/batch
Epoch 8/20  Iteration 1288/3560 Training loss: 1.5937 2.4899 sec/batch
Epoch 8/20  Iteration 1289/3560 Training loss: 1.5930 2.4654 sec/batch
Epoch 8/20  Iteration 1290/3560 Training loss: 1.5925 2.4857 sec/batch
Epoch 8/20  Iteration 1291/3560 Training loss: 1.5925 2.4558 sec/batch
Epoch 8/20  Iteration 1292/3560 Training loss: 1.5914 2.4512 sec/batch
Epoch 8/20  Iteration 1293/3560 Training loss: 1.5908 2.4568 sec/batch
Epoch 8/20  Iteration 1294/3560 Training loss: 1.5903 2.4979 sec/batch
Epoch 8/20  Iteration 1295/3560 Training loss: 1.5898 2.7416 sec/batch
Epoch 8/20  Iteration 1296/3560 Training loss: 1.5903 2.4528 sec/batch
Epoch 8/20  Iteration 1297/3560 Training loss: 1.5896 2.4625 sec/batch
Epoch 8/20  Iteration 1298/3560 Training loss: 1.5905 2.4501 sec/batch
Epoch 8/20  Iteration 1299/3560 Training loss: 1.5903 2.4730 sec/batch
Epoch 8/20  Iteration 1300/3560 Training loss: 1.5903 2.4466 sec/batch
Epoch 8/20  Iteration 1301/3560 Training loss: 1.5900 2.4609 sec/batch
Epoch 8/20  Iteration 1302/3560 Training loss: 1.5901 2.4699 sec/batch
Epoch 8/20  Iteration 1303/3560 Training loss: 1.5904 2.4554 sec/batch
Epoch 8/20  Iteration 1304/3560 Training loss: 1.5900 2.4541 sec/batch
Epoch 8/20  Iteration 1305/3560 Training loss: 1.5893 2.4723 sec/batch
Epoch 8/20  Iteration 1306/3560 Training loss: 1.5898 2.4609 sec/batch
Epoch 8/20  Iteration 1307/3560 Training loss: 1.5897 2.4577 sec/batch
Epoch 8/20  Iteration 1308/3560 Training loss: 1.5904 2.4525 sec/batch
Epoch 8/20  Iteration 1309/3560 Training loss: 1.5908 2.4976 sec/batch
Epoch 8/20  Iteration 1310/3560 Training loss: 1.5911 2.4528 sec/batch
Epoch 8/20  Iteration 1311/3560 Training loss: 1.5908 2.6644 sec/batch
Epoch 8/20  Iteration 1312/3560 Training loss: 1.5909 2.5263 sec/batch
Epoch 8/20  Iteration 1313/3560 Training loss: 1.5910 2.4611 sec/batch
Epoch 8/20  Iteration 1314/3560 Training loss: 1.5906 2.4648 sec/batch
Epoch 8/20  Iteration 1315/3560 Training loss: 1.5905 2.4466 sec/batch
Epoch 8/20  Iteration 1316/3560 Training loss: 1.5903 2.4616 sec/batch
Epoch 8/20  Iteration 1317/3560 Training loss: 1.5909 2.4696 sec/batch
Epoch 8/20  Iteration 1318/3560 Training loss: 1.5909 2.4534 sec/batch
Epoch 8/20  Iteration 1319/3560 Training loss: 1.5913 2.4629 sec/batch
Epoch 8/20  Iteration 1320/3560 Training loss: 1.5909 2.4541 sec/batch
Epoch 8/20  Iteration 1321/3560 Training loss: 1.5906 2.4554 sec/batch
Epoch 8/20  Iteration 1322/3560 Training loss: 1.5908 2.4549 sec/batch
Epoch 8/20  Iteration 1323/3560 Training loss: 1.5905 2.4659 sec/batch
Epoch 8/20  Iteration 1324/3560 Training loss: 1.5904 2.4469 sec/batch
Epoch 8/20  Iteration 1325/3560 Training loss: 1.5897 2.4474 sec/batch
Epoch 8/20  Iteration 1326/3560 Training loss: 1.5895 2.4627 sec/batch
Epoch 8/20  Iteration 1327/3560 Training loss: 1.5888 2.4490 sec/batch
Epoch 8/20  Iteration 1328/3560 Training loss: 1.5887 2.4694 sec/batch
Epoch 8/20  Iteration 1329/3560 Training loss: 1.5881 2.4571 sec/batch
Epoch 8/20  Iteration 1330/3560 Training loss: 1.5880 2.7031 sec/batch
Epoch 8/20  Iteration 1331/3560 Training loss: 1.5876 2.4582 sec/batch
Epoch 8/20  Iteration 1332/3560 Training loss: 1.5873 2.4667 sec/batch
Epoch 8/20  Iteration 1333/3560 Training loss: 1.5870 2.4611 sec/batch
Epoch 8/20  Iteration 1334/3560 Training loss: 1.5866 2.4465 sec/batch
Epoch 8/20  Iteration 1335/3560 Training loss: 1.5862 2.4597 sec/batch
Epoch 8/20  Iteration 1336/3560 Training loss: 1.5863 2.4552 sec/batch
...
Epoch 8/20  Iteration 1380/3560 Training loss: 1.5758 2.4545 sec/batch
...
Epoch 8/20  Iteration 1424/3560 Training loss: 1.5722 2.5044 sec/batch
Epoch 9/20  Iteration 1425/3560 Training loss: 1.6436 2.4490 sec/batch
...
Epoch 9/20  Iteration 1500/3560 Training loss: 1.5333 2.4539 sec/batch
...
Epoch 9/20  Iteration 1602/3560 Training loss: 1.5155 2.5107 sec/batch
Epoch 10/20  Iteration 1603/3560 Training loss: 1.6037 2.6391 sec/batch
...
Epoch 10/20  Iteration 1680/3560 Training loss: 1.4853 2.4634 sec/batch
...
Epoch 10/20  Iteration 1780/3560 Training loss: 1.4711 2.5019 sec/batch
Epoch 11/20  Iteration 1781/3560 Training loss: 1.5750 2.6755 sec/batch
...
Epoch 11/20  Iteration 1870/3560 Training loss: 1.4454 2.4723 sec/batch
...
Epoch 11/20  Iteration 1958/3560 Training loss: 1.4350 2.4746 sec/batch
Epoch 12/20  Iteration 1959/3560 Training loss: 1.5520 2.4563 sec/batch
...
Epoch 12/20  Iteration 2048/3560 Training loss: 1.4118 2.4444 sec/batch
...
Epoch 12/20  Iteration 2136/3560 Training loss: 1.4028 2.4666 sec/batch
Epoch 13/20  Iteration 2137/3560 Training loss: 1.5287 2.6387 sec/batch
[... iterations 2138-2313 omitted: the running training loss settles from about 1.46 to 1.37 over the rest of epoch 13 ...]
Epoch 13/20  Iteration 2314/3560 Training loss: 1.3745 2.4613 sec/batch
Epoch 14/20  Iteration 2315/3560 Training loss: 1.4962 2.4464 sec/batch
[... iterations 2316-2491 omitted: the running training loss settles from about 1.44 to 1.35 over the rest of epoch 14 ...]
Epoch 14/20  Iteration 2492/3560 Training loss: 1.3525 2.4485 sec/batch
Epoch 15/20  Iteration 2493/3560 Training loss: 1.4709 2.4600 sec/batch
[... iterations 2494-2669 omitted: the running training loss settles from about 1.42 to 1.33 over the rest of epoch 15 ...]
Epoch 15/20  Iteration 2670/3560 Training loss: 1.3292 2.6544 sec/batch
Epoch 16/20  Iteration 2671/3560 Training loss: 1.4444 2.4831 sec/batch
[... iterations 2672-2847 omitted: the running training loss settles from about 1.39 to 1.31 over the rest of epoch 16 ...]
Epoch 16/20  Iteration 2848/3560 Training loss: 1.3102 2.4578 sec/batch
Epoch 17/20  Iteration 2849/3560 Training loss: 1.4209 2.4657 sec/batch
[... iterations 2850-3020 omitted: the running training loss settles from about 1.36 to 1.29 as epoch 17 progresses ...]
Epoch 17/20  Iteration 3021/3560 Training loss: 1.2924 2.4606 sec/batch
Epoch 17/20  Iteration 3022/3560 Training loss: 1.2923 2.4758 sec/batch
Epoch 17/20  Iteration 3023/3560 Training loss: 1.2924 2.4554 sec/batch
Epoch 17/20  Iteration 3024/3560 Training loss: 1.2922 2.4546 sec/batch
Epoch 17/20  Iteration 3025/3560 Training loss: 1.2922 2.4517 sec/batch
Epoch 17/20  Iteration 3026/3560 Training loss: 1.2924 2.4584 sec/batch
Epoch 18/20  Iteration 3027/3560 Training loss: 1.4088 2.4408 sec/batch
Epoch 18/20  Iteration 3028/3560 Training loss: 1.3552 2.4792 sec/batch
Epoch 18/20  Iteration 3029/3560 Training loss: 1.3312 2.4645 sec/batch
Epoch 18/20  Iteration 3030/3560 Training loss: 1.3248 2.4562 sec/batch
Epoch 18/20  Iteration 3031/3560 Training loss: 1.3127 2.4694 sec/batch
Epoch 18/20  Iteration 3032/3560 Training loss: 1.2997 2.4386 sec/batch
Epoch 18/20  Iteration 3033/3560 Training loss: 1.2989 2.4539 sec/batch
Epoch 18/20  Iteration 3034/3560 Training loss: 1.2959 2.4585 sec/batch
Epoch 18/20  Iteration 3035/3560 Training loss: 1.2947 2.4482 sec/batch
Epoch 18/20  Iteration 3036/3560 Training loss: 1.2930 2.4593 sec/batch
Epoch 18/20  Iteration 3037/3560 Training loss: 1.2890 2.4464 sec/batch
Epoch 18/20  Iteration 3038/3560 Training loss: 1.2887 2.4628 sec/batch
Epoch 18/20  Iteration 3039/3560 Training loss: 1.2893 2.4591 sec/batch
Epoch 18/20  Iteration 3040/3560 Training loss: 1.2898 2.4723 sec/batch
Epoch 18/20  Iteration 3041/3560 Training loss: 1.2884 2.4609 sec/batch
Epoch 18/20  Iteration 3042/3560 Training loss: 1.2870 2.4544 sec/batch
Epoch 18/20  Iteration 3043/3560 Training loss: 1.2871 2.4622 sec/batch
Epoch 18/20  Iteration 3044/3560 Training loss: 1.2881 2.4471 sec/batch
Epoch 18/20  Iteration 3045/3560 Training loss: 1.2881 2.4510 sec/batch
Epoch 18/20  Iteration 3046/3560 Training loss: 1.2892 2.4667 sec/batch
Epoch 18/20  Iteration 3047/3560 Training loss: 1.2889 2.4494 sec/batch
Epoch 18/20  Iteration 3048/3560 Training loss: 1.2895 2.4630 sec/batch
Epoch 18/20  Iteration 3049/3560 Training loss: 1.2890 2.4621 sec/batch
Epoch 18/20  Iteration 3050/3560 Training loss: 1.2889 2.4500 sec/batch
Epoch 18/20  Iteration 3051/3560 Training loss: 1.2886 2.4567 sec/batch
Epoch 18/20  Iteration 3052/3560 Training loss: 1.2863 2.4670 sec/batch
Epoch 18/20  Iteration 3053/3560 Training loss: 1.2852 2.4667 sec/batch
Epoch 18/20  Iteration 3054/3560 Training loss: 1.2859 2.4500 sec/batch
Epoch 18/20  Iteration 3055/3560 Training loss: 1.2863 2.4610 sec/batch
Epoch 18/20  Iteration 3056/3560 Training loss: 1.2864 2.4605 sec/batch
Epoch 18/20  Iteration 3057/3560 Training loss: 1.2857 2.4497 sec/batch
Epoch 18/20  Iteration 3058/3560 Training loss: 1.2850 2.7336 sec/batch
Epoch 18/20  Iteration 3059/3560 Training loss: 1.2852 2.5091 sec/batch
Epoch 18/20  Iteration 3060/3560 Training loss: 1.2855 2.4599 sec/batch
Epoch 18/20  Iteration 3061/3560 Training loss: 1.2851 2.4530 sec/batch
Epoch 18/20  Iteration 3062/3560 Training loss: 1.2848 2.4473 sec/batch
Epoch 18/20  Iteration 3063/3560 Training loss: 1.2840 2.4602 sec/batch
Epoch 18/20  Iteration 3064/3560 Training loss: 1.2830 2.4666 sec/batch
Epoch 18/20  Iteration 3065/3560 Training loss: 1.2818 2.4635 sec/batch
Epoch 18/20  Iteration 3066/3560 Training loss: 1.2816 2.4474 sec/batch
Epoch 18/20  Iteration 3067/3560 Training loss: 1.2809 2.4535 sec/batch
Epoch 18/20  Iteration 3068/3560 Training loss: 1.2818 2.4639 sec/batch
Epoch 18/20  Iteration 3069/3560 Training loss: 1.2814 2.4541 sec/batch
Epoch 18/20  Iteration 3070/3560 Training loss: 1.2808 2.4685 sec/batch
Epoch 18/20  Iteration 3071/3560 Training loss: 1.2811 2.4506 sec/batch
Epoch 18/20  Iteration 3072/3560 Training loss: 1.2805 2.4672 sec/batch
Epoch 18/20  Iteration 3073/3560 Training loss: 1.2801 2.4440 sec/batch
Epoch 18/20  Iteration 3074/3560 Training loss: 1.2795 2.4471 sec/batch
Epoch 18/20  Iteration 3075/3560 Training loss: 1.2793 2.4557 sec/batch
Epoch 18/20  Iteration 3076/3560 Training loss: 1.2798 2.4622 sec/batch
Epoch 18/20  Iteration 3077/3560 Training loss: 1.2795 2.4614 sec/batch
Epoch 18/20  Iteration 3078/3560 Training loss: 1.2803 2.4486 sec/batch
Epoch 18/20  Iteration 3079/3560 Training loss: 1.2802 2.4572 sec/batch
Epoch 18/20  Iteration 3080/3560 Training loss: 1.2804 2.4598 sec/batch
Epoch 18/20  Iteration 3081/3560 Training loss: 1.2801 2.4459 sec/batch
Epoch 18/20  Iteration 3082/3560 Training loss: 1.2803 2.4626 sec/batch
Epoch 18/20  Iteration 3083/3560 Training loss: 1.2806 2.4609 sec/batch
Epoch 18/20  Iteration 3084/3560 Training loss: 1.2802 2.4503 sec/batch
Epoch 18/20  Iteration 3085/3560 Training loss: 1.2798 2.4602 sec/batch
Epoch 18/20  Iteration 3086/3560 Training loss: 1.2801 2.4518 sec/batch
Epoch 18/20  Iteration 3087/3560 Training loss: 1.2800 2.4581 sec/batch
Epoch 18/20  Iteration 3088/3560 Training loss: 1.2810 2.4702 sec/batch
Epoch 18/20  Iteration 3089/3560 Training loss: 1.2812 2.4983 sec/batch
Epoch 18/20  Iteration 3090/3560 Training loss: 1.2812 2.4760 sec/batch
Epoch 18/20  Iteration 3091/3560 Training loss: 1.2810 2.4794 sec/batch
Epoch 18/20  Iteration 3092/3560 Training loss: 1.2811 2.4583 sec/batch
Epoch 18/20  Iteration 3093/3560 Training loss: 1.2815 2.5967 sec/batch
Epoch 18/20  Iteration 3094/3560 Training loss: 1.2812 2.5159 sec/batch
Epoch 18/20  Iteration 3095/3560 Training loss: 1.2813 2.4560 sec/batch
Epoch 18/20  Iteration 3096/3560 Training loss: 1.2813 2.4804 sec/batch
Epoch 18/20  Iteration 3097/3560 Training loss: 1.2820 2.4510 sec/batch
Epoch 18/20  Iteration 3098/3560 Training loss: 1.2823 2.4489 sec/batch
Epoch 18/20  Iteration 3099/3560 Training loss: 1.2827 2.4566 sec/batch
Epoch 18/20  Iteration 3100/3560 Training loss: 1.2826 2.4627 sec/batch
Epoch 18/20  Iteration 3101/3560 Training loss: 1.2826 2.6221 sec/batch
Epoch 18/20  Iteration 3102/3560 Training loss: 1.2829 2.4507 sec/batch
Epoch 18/20  Iteration 3103/3560 Training loss: 1.2828 2.4632 sec/batch
Epoch 18/20  Iteration 3104/3560 Training loss: 1.2828 2.4640 sec/batch
Epoch 18/20  Iteration 3105/3560 Training loss: 1.2823 2.4521 sec/batch
Epoch 18/20  Iteration 3106/3560 Training loss: 1.2823 2.5987 sec/batch
Epoch 18/20  Iteration 3107/3560 Training loss: 1.2820 2.5492 sec/batch
Epoch 18/20  Iteration 3108/3560 Training loss: 1.2818 2.4801 sec/batch
Epoch 18/20  Iteration 3109/3560 Training loss: 1.2814 2.4494 sec/batch
Epoch 18/20  Iteration 3110/3560 Training loss: 1.2815 2.4502 sec/batch
Epoch 18/20  Iteration 3111/3560 Training loss: 1.2812 2.4623 sec/batch
Epoch 18/20  Iteration 3112/3560 Training loss: 1.2811 2.4727 sec/batch
Epoch 18/20  Iteration 3113/3560 Training loss: 1.2810 2.4613 sec/batch
Epoch 18/20  Iteration 3114/3560 Training loss: 1.2807 2.5992 sec/batch
Epoch 18/20  Iteration 3115/3560 Training loss: 1.2805 2.6219 sec/batch
Epoch 18/20  Iteration 3116/3560 Training loss: 1.2805 2.4489 sec/batch
Epoch 18/20  Iteration 3117/3560 Training loss: 1.2802 2.4754 sec/batch
Epoch 18/20  Iteration 3118/3560 Training loss: 1.2802 2.4466 sec/batch
Epoch 18/20  Iteration 3119/3560 Training loss: 1.2799 2.4569 sec/batch
Epoch 18/20  Iteration 3120/3560 Training loss: 1.2796 2.4615 sec/batch
Epoch 18/20  Iteration 3121/3560 Training loss: 1.2795 2.4543 sec/batch
Epoch 18/20  Iteration 3122/3560 Training loss: 1.2796 2.4494 sec/batch
Epoch 18/20  Iteration 3123/3560 Training loss: 1.2796 2.4574 sec/batch
Epoch 18/20  Iteration 3124/3560 Training loss: 1.2792 2.4507 sec/batch
Epoch 18/20  Iteration 3125/3560 Training loss: 1.2788 2.4805 sec/batch
Epoch 18/20  Iteration 3126/3560 Training loss: 1.2784 2.4543 sec/batch
Epoch 18/20  Iteration 3127/3560 Training loss: 1.2784 2.4669 sec/batch
Epoch 18/20  Iteration 3128/3560 Training loss: 1.2782 2.4478 sec/batch
Epoch 18/20  Iteration 3129/3560 Training loss: 1.2782 2.4511 sec/batch
Epoch 18/20  Iteration 3130/3560 Training loss: 1.2781 2.4553 sec/batch
Epoch 18/20  Iteration 3131/3560 Training loss: 1.2779 2.4513 sec/batch
Epoch 18/20  Iteration 3132/3560 Training loss: 1.2779 2.4591 sec/batch
Epoch 18/20  Iteration 3133/3560 Training loss: 1.2779 2.4482 sec/batch
Epoch 18/20  Iteration 3134/3560 Training loss: 1.2779 2.4538 sec/batch
Epoch 18/20  Iteration 3135/3560 Training loss: 1.2778 2.4644 sec/batch
Epoch 18/20  Iteration 3136/3560 Training loss: 1.2778 2.4515 sec/batch
Epoch 18/20  Iteration 3137/3560 Training loss: 1.2777 2.4827 sec/batch
Epoch 18/20  Iteration 3138/3560 Training loss: 1.2776 2.4432 sec/batch
Epoch 18/20  Iteration 3139/3560 Training loss: 1.2775 2.4591 sec/batch
Epoch 18/20  Iteration 3140/3560 Training loss: 1.2773 2.4467 sec/batch
Epoch 18/20  Iteration 3141/3560 Training loss: 1.2771 2.4513 sec/batch
Epoch 18/20  Iteration 3142/3560 Training loss: 1.2768 2.4672 sec/batch
Epoch 18/20  Iteration 3143/3560 Training loss: 1.2769 2.4563 sec/batch
Epoch 18/20  Iteration 3144/3560 Training loss: 1.2769 2.4642 sec/batch
Epoch 18/20  Iteration 3145/3560 Training loss: 1.2768 2.4503 sec/batch
Epoch 18/20  Iteration 3146/3560 Training loss: 1.2767 2.4430 sec/batch
Epoch 18/20  Iteration 3147/3560 Training loss: 1.2766 2.4609 sec/batch
Epoch 18/20  Iteration 3148/3560 Training loss: 1.2762 2.4697 sec/batch
Epoch 18/20  Iteration 3149/3560 Training loss: 1.2758 2.4747 sec/batch
Epoch 18/20  Iteration 3150/3560 Training loss: 1.2758 2.4470 sec/batch
Epoch 18/20  Iteration 3151/3560 Training loss: 1.2757 2.4508 sec/batch
Epoch 18/20  Iteration 3152/3560 Training loss: 1.2753 2.4665 sec/batch
Epoch 18/20  Iteration 3153/3560 Training loss: 1.2754 2.4507 sec/batch
Epoch 18/20  Iteration 3154/3560 Training loss: 1.2754 2.4617 sec/batch
Epoch 18/20  Iteration 3155/3560 Training loss: 1.2752 2.6401 sec/batch
Epoch 18/20  Iteration 3156/3560 Training loss: 1.2749 2.5247 sec/batch
Epoch 18/20  Iteration 3157/3560 Training loss: 1.2745 2.4500 sec/batch
Epoch 18/20  Iteration 3158/3560 Training loss: 1.2742 2.4516 sec/batch
Epoch 18/20  Iteration 3159/3560 Training loss: 1.2744 2.4598 sec/batch
Epoch 18/20  Iteration 3160/3560 Training loss: 1.2743 2.4575 sec/batch
Epoch 18/20  Iteration 3161/3560 Training loss: 1.2743 2.4756 sec/batch
Epoch 18/20  Iteration 3162/3560 Training loss: 1.2744 2.4494 sec/batch
Epoch 18/20  Iteration 3163/3560 Training loss: 1.2746 2.4509 sec/batch
Epoch 18/20  Iteration 3164/3560 Training loss: 1.2746 2.4548 sec/batch
Epoch 18/20  Iteration 3165/3560 Training loss: 1.2746 2.4481 sec/batch
Epoch 18/20  Iteration 3166/3560 Training loss: 1.2746 2.4670 sec/batch
Epoch 18/20  Iteration 3167/3560 Training loss: 1.2750 2.4473 sec/batch
Epoch 18/20  Iteration 3168/3560 Training loss: 1.2750 2.4426 sec/batch
Epoch 18/20  Iteration 3169/3560 Training loss: 1.2749 2.4608 sec/batch
Epoch 18/20  Iteration 3170/3560 Training loss: 1.2751 2.4621 sec/batch
Epoch 18/20  Iteration 3171/3560 Training loss: 1.2750 2.4644 sec/batch
Epoch 18/20  Iteration 3172/3560 Training loss: 1.2751 2.4510 sec/batch
Epoch 18/20  Iteration 3173/3560 Training loss: 1.2752 2.4771 sec/batch
Epoch 18/20  Iteration 3174/3560 Training loss: 1.2754 2.4600 sec/batch
Epoch 18/20  Iteration 3175/3560 Training loss: 1.2755 2.4593 sec/batch
Epoch 18/20  Iteration 3176/3560 Training loss: 1.2753 2.4645 sec/batch
Epoch 18/20  Iteration 3177/3560 Training loss: 1.2750 2.4540 sec/batch
Epoch 18/20  Iteration 3178/3560 Training loss: 1.2749 2.4611 sec/batch
Epoch 18/20  Iteration 3179/3560 Training loss: 1.2749 2.4557 sec/batch
Epoch 18/20  Iteration 3180/3560 Training loss: 1.2748 2.4528 sec/batch
Epoch 18/20  Iteration 3181/3560 Training loss: 1.2748 2.4664 sec/batch
Epoch 18/20  Iteration 3182/3560 Training loss: 1.2748 2.4596 sec/batch
Epoch 18/20  Iteration 3183/3560 Training loss: 1.2747 2.4654 sec/batch
Epoch 18/20  Iteration 3184/3560 Training loss: 1.2747 2.4566 sec/batch
Epoch 18/20  Iteration 3185/3560 Training loss: 1.2744 2.4640 sec/batch
Epoch 18/20  Iteration 3186/3560 Training loss: 1.2745 2.4552 sec/batch
Epoch 18/20  Iteration 3187/3560 Training loss: 1.2747 2.4587 sec/batch
Epoch 18/20  Iteration 3188/3560 Training loss: 1.2747 2.4704 sec/batch
Epoch 18/20  Iteration 3189/3560 Training loss: 1.2746 2.4511 sec/batch
Epoch 18/20  Iteration 3190/3560 Training loss: 1.2746 2.4506 sec/batch
Epoch 18/20  Iteration 3191/3560 Training loss: 1.2746 2.4545 sec/batch
Epoch 18/20  Iteration 3192/3560 Training loss: 1.2746 2.4461 sec/batch
Epoch 18/20  Iteration 3193/3560 Training loss: 1.2747 2.4715 sec/batch
Epoch 18/20  Iteration 3194/3560 Training loss: 1.2751 2.4562 sec/batch
Epoch 18/20  Iteration 3195/3560 Training loss: 1.2751 2.4611 sec/batch
Epoch 18/20  Iteration 3196/3560 Training loss: 1.2752 2.4542 sec/batch
Epoch 18/20  Iteration 3197/3560 Training loss: 1.2751 2.4555 sec/batch
Epoch 18/20  Iteration 3198/3560 Training loss: 1.2749 2.4732 sec/batch
Epoch 18/20  Iteration 3199/3560 Training loss: 1.2750 2.4463 sec/batch
Epoch 18/20  Iteration 3200/3560 Training loss: 1.2750 2.4696 sec/batch
Epoch 18/20  Iteration 3201/3560 Training loss: 1.2750 2.4487 sec/batch
Epoch 18/20  Iteration 3202/3560 Training loss: 1.2749 2.4643 sec/batch
Epoch 18/20  Iteration 3203/3560 Training loss: 1.2748 2.6488 sec/batch
Epoch 18/20  Iteration 3204/3560 Training loss: 1.2750 2.5116 sec/batch
Epoch 19/20  Iteration 3205/3560 Training loss: 1.3852 2.4738 sec/batch
Epoch 19/20  Iteration 3206/3560 Training loss: 1.3371 2.4503 sec/batch
Epoch 19/20  Iteration 3207/3560 Training loss: 1.3176 2.4527 sec/batch
Epoch 19/20  Iteration 3208/3560 Training loss: 1.3096 2.4481 sec/batch
Epoch 19/20  Iteration 3209/3560 Training loss: 1.2971 2.4644 sec/batch
Epoch 19/20  Iteration 3210/3560 Training loss: 1.2858 2.4725 sec/batch
Epoch 19/20  Iteration 3211/3560 Training loss: 1.2848 2.4512 sec/batch
Epoch 19/20  Iteration 3212/3560 Training loss: 1.2821 2.4656 sec/batch
Epoch 19/20  Iteration 3213/3560 Training loss: 1.2802 2.4517 sec/batch
Epoch 19/20  Iteration 3214/3560 Training loss: 1.2781 2.4634 sec/batch
Epoch 19/20  Iteration 3215/3560 Training loss: 1.2741 2.4514 sec/batch
Epoch 19/20  Iteration 3216/3560 Training loss: 1.2741 2.4576 sec/batch
Epoch 19/20  Iteration 3217/3560 Training loss: 1.2749 2.4665 sec/batch
Epoch 19/20  Iteration 3218/3560 Training loss: 1.2754 2.4417 sec/batch
Epoch 19/20  Iteration 3219/3560 Training loss: 1.2739 2.4631 sec/batch
Epoch 19/20  Iteration 3220/3560 Training loss: 1.2725 2.4544 sec/batch
Epoch 19/20  Iteration 3221/3560 Training loss: 1.2729 2.4493 sec/batch
Epoch 19/20  Iteration 3222/3560 Training loss: 1.2746 2.4810 sec/batch
Epoch 19/20  Iteration 3223/3560 Training loss: 1.2751 2.4493 sec/batch
Epoch 19/20  Iteration 3224/3560 Training loss: 1.2761 2.4731 sec/batch
Epoch 19/20  Iteration 3225/3560 Training loss: 1.2755 2.4547 sec/batch
Epoch 19/20  Iteration 3226/3560 Training loss: 1.2756 2.4724 sec/batch
Epoch 19/20  Iteration 3227/3560 Training loss: 1.2753 2.4495 sec/batch
Epoch 19/20  Iteration 3228/3560 Training loss: 1.2754 2.4474 sec/batch
Epoch 19/20  Iteration 3229/3560 Training loss: 1.2749 2.4612 sec/batch
Epoch 19/20  Iteration 3230/3560 Training loss: 1.2726 2.4505 sec/batch
Epoch 19/20  Iteration 3231/3560 Training loss: 1.2715 2.4462 sec/batch
Epoch 19/20  Iteration 3232/3560 Training loss: 1.2721 2.4621 sec/batch
Epoch 19/20  Iteration 3233/3560 Training loss: 1.2721 2.4524 sec/batch
Epoch 19/20  Iteration 3234/3560 Training loss: 1.2725 2.4708 sec/batch
Epoch 19/20  Iteration 3235/3560 Training loss: 1.2720 2.4553 sec/batch
Epoch 19/20  Iteration 3236/3560 Training loss: 1.2711 2.6919 sec/batch
Epoch 19/20  Iteration 3237/3560 Training loss: 1.2713 2.4584 sec/batch
Epoch 19/20  Iteration 3238/3560 Training loss: 1.2716 2.4558 sec/batch
Epoch 19/20  Iteration 3239/3560 Training loss: 1.2712 2.4476 sec/batch
Epoch 19/20  Iteration 3240/3560 Training loss: 1.2710 2.4634 sec/batch
Epoch 19/20  Iteration 3241/3560 Training loss: 1.2703 2.4582 sec/batch
Epoch 19/20  Iteration 3242/3560 Training loss: 1.2693 2.4704 sec/batch
Epoch 19/20  Iteration 3243/3560 Training loss: 1.2683 2.4619 sec/batch
Epoch 19/20  Iteration 3244/3560 Training loss: 1.2679 2.4496 sec/batch
Epoch 19/20  Iteration 3245/3560 Training loss: 1.2672 2.4630 sec/batch
Epoch 19/20  Iteration 3246/3560 Training loss: 1.2683 2.4639 sec/batch
Epoch 19/20  Iteration 3247/3560 Training loss: 1.2681 2.4708 sec/batch
Epoch 19/20  Iteration 3248/3560 Training loss: 1.2675 2.4571 sec/batch
Epoch 19/20  Iteration 3249/3560 Training loss: 1.2677 2.4591 sec/batch
Epoch 19/20  Iteration 3250/3560 Training loss: 1.2670 2.4811 sec/batch
Epoch 19/20  Iteration 3251/3560 Training loss: 1.2665 2.4537 sec/batch
Epoch 19/20  Iteration 3252/3560 Training loss: 1.2660 2.6628 sec/batch
Epoch 19/20  Iteration 3253/3560 Training loss: 1.2660 2.4837 sec/batch
Epoch 19/20  Iteration 3254/3560 Training loss: 1.2662 2.5009 sec/batch
Epoch 19/20  Iteration 3255/3560 Training loss: 1.2657 2.4546 sec/batch
Epoch 19/20  Iteration 3256/3560 Training loss: 1.2663 2.4615 sec/batch
Epoch 19/20  Iteration 3257/3560 Training loss: 1.2662 2.4566 sec/batch
Epoch 19/20  Iteration 3258/3560 Training loss: 1.2665 2.4687 sec/batch
Epoch 19/20  Iteration 3259/3560 Training loss: 1.2662 2.4570 sec/batch
Epoch 19/20  Iteration 3260/3560 Training loss: 1.2664 2.4517 sec/batch
Epoch 19/20  Iteration 3261/3560 Training loss: 1.2666 2.4547 sec/batch
Epoch 19/20  Iteration 3262/3560 Training loss: 1.2663 2.4746 sec/batch
Epoch 19/20  Iteration 3263/3560 Training loss: 1.2659 2.4454 sec/batch
Epoch 19/20  Iteration 3264/3560 Training loss: 1.2664 2.4580 sec/batch
Epoch 19/20  Iteration 3265/3560 Training loss: 1.2663 2.4512 sec/batch
Epoch 19/20  Iteration 3266/3560 Training loss: 1.2671 2.4629 sec/batch
Epoch 19/20  Iteration 3267/3560 Training loss: 1.2672 2.4455 sec/batch
Epoch 19/20  Iteration 3268/3560 Training loss: 1.2672 2.4483 sec/batch
Epoch 19/20  Iteration 3269/3560 Training loss: 1.2672 2.4615 sec/batch
Epoch 19/20  Iteration 3270/3560 Training loss: 1.2673 2.4590 sec/batch
Epoch 19/20  Iteration 3271/3560 Training loss: 1.2676 2.4572 sec/batch
Epoch 19/20  Iteration 3272/3560 Training loss: 1.2673 2.4706 sec/batch
Epoch 19/20  Iteration 3273/3560 Training loss: 1.2674 2.4595 sec/batch
Epoch 19/20  Iteration 3274/3560 Training loss: 1.2674 2.4631 sec/batch
Epoch 19/20  Iteration 3275/3560 Training loss: 1.2679 2.4544 sec/batch
Epoch 19/20  Iteration 3276/3560 Training loss: 1.2683 2.4659 sec/batch
Epoch 19/20  Iteration 3277/3560 Training loss: 1.2686 2.4515 sec/batch
Epoch 19/20  Iteration 3278/3560 Training loss: 1.2684 2.4676 sec/batch
Epoch 19/20  Iteration 3279/3560 Training loss: 1.2683 2.4471 sec/batch
Epoch 19/20  Iteration 3280/3560 Training loss: 1.2686 2.4689 sec/batch
Epoch 19/20  Iteration 3281/3560 Training loss: 1.2685 2.4477 sec/batch
Epoch 19/20  Iteration 3282/3560 Training loss: 1.2685 2.4714 sec/batch
Epoch 19/20  Iteration 3283/3560 Training loss: 1.2678 2.4611 sec/batch
Epoch 19/20  Iteration 3284/3560 Training loss: 1.2678 2.4482 sec/batch
Epoch 19/20  Iteration 3285/3560 Training loss: 1.2674 2.4608 sec/batch
Epoch 19/20  Iteration 3286/3560 Training loss: 1.2674 2.4533 sec/batch
Epoch 19/20  Iteration 3287/3560 Training loss: 1.2670 2.4463 sec/batch
Epoch 19/20  Iteration 3288/3560 Training loss: 1.2670 2.4763 sec/batch
Epoch 19/20  Iteration 3289/3560 Training loss: 1.2667 2.4477 sec/batch
Epoch 19/20  Iteration 3290/3560 Training loss: 1.2667 2.4707 sec/batch
Epoch 19/20  Iteration 3291/3560 Training loss: 1.2665 2.4547 sec/batch
Epoch 19/20  Iteration 3292/3560 Training loss: 1.2662 2.4549 sec/batch
Epoch 19/20  Iteration 3293/3560 Training loss: 1.2659 2.4548 sec/batch
Epoch 19/20  Iteration 3294/3560 Training loss: 1.2660 2.4674 sec/batch
Epoch 19/20  Iteration 3295/3560 Training loss: 1.2657 2.4792 sec/batch
Epoch 19/20  Iteration 3296/3560 Training loss: 1.2657 2.4560 sec/batch
Epoch 19/20  Iteration 3297/3560 Training loss: 1.2653 2.4649 sec/batch
Epoch 19/20  Iteration 3298/3560 Training loss: 1.2650 2.4584 sec/batch
Epoch 19/20  Iteration 3299/3560 Training loss: 1.2648 2.4632 sec/batch
Epoch 19/20  Iteration 3300/3560 Training loss: 1.2648 2.6542 sec/batch
Epoch 19/20  Iteration 3301/3560 Training loss: 1.2648 2.5081 sec/batch
Epoch 19/20  Iteration 3302/3560 Training loss: 1.2645 2.4661 sec/batch
Epoch 19/20  Iteration 3303/3560 Training loss: 1.2640 2.4482 sec/batch
Epoch 19/20  Iteration 3304/3560 Training loss: 1.2636 2.4694 sec/batch
Epoch 19/20  Iteration 3305/3560 Training loss: 1.2636 2.4498 sec/batch
Epoch 19/20  Iteration 3306/3560 Training loss: 1.2635 2.4623 sec/batch
Epoch 19/20  Iteration 3307/3560 Training loss: 1.2634 2.4753 sec/batch
Epoch 19/20  Iteration 3308/3560 Training loss: 1.2633 2.4466 sec/batch
Epoch 19/20  Iteration 3309/3560 Training loss: 1.2633 2.4596 sec/batch
Epoch 19/20  Iteration 3310/3560 Training loss: 1.2631 2.4561 sec/batch
Epoch 19/20  Iteration 3311/3560 Training loss: 1.2631 2.4631 sec/batch
Epoch 19/20  Iteration 3312/3560 Training loss: 1.2632 2.4595 sec/batch
Epoch 19/20  Iteration 3313/3560 Training loss: 1.2631 2.4494 sec/batch
Epoch 19/20  Iteration 3314/3560 Training loss: 1.2632 2.4758 sec/batch
Epoch 19/20  Iteration 3315/3560 Training loss: 1.2630 2.4513 sec/batch
Epoch 19/20  Iteration 3316/3560 Training loss: 1.2630 2.4590 sec/batch
Epoch 19/20  Iteration 3317/3560 Training loss: 1.2628 2.4718 sec/batch
Epoch 19/20  Iteration 3318/3560 Training loss: 1.2626 2.4784 sec/batch
Epoch 19/20  Iteration 3319/3560 Training loss: 1.2624 2.4728 sec/batch
Epoch 19/20  Iteration 3320/3560 Training loss: 1.2621 2.4494 sec/batch
Epoch 19/20  Iteration 3321/3560 Training loss: 1.2621 2.4915 sec/batch
Epoch 19/20  Iteration 3322/3560 Training loss: 1.2622 2.4451 sec/batch
Epoch 19/20  Iteration 3323/3560 Training loss: 1.2620 2.4638 sec/batch
Epoch 19/20  Iteration 3324/3560 Training loss: 1.2620 2.4473 sec/batch
Epoch 19/20  Iteration 3325/3560 Training loss: 1.2619 2.4871 sec/batch
Epoch 19/20  Iteration 3326/3560 Training loss: 1.2615 2.4599 sec/batch
Epoch 19/20  Iteration 3327/3560 Training loss: 1.2611 2.4485 sec/batch
Epoch 19/20  Iteration 3328/3560 Training loss: 1.2611 2.4629 sec/batch
Epoch 19/20  Iteration 3329/3560 Training loss: 1.2609 2.4677 sec/batch
Epoch 19/20  Iteration 3330/3560 Training loss: 1.2605 2.4651 sec/batch
Epoch 19/20  Iteration 3331/3560 Training loss: 1.2605 2.4706 sec/batch
Epoch 19/20  Iteration 3332/3560 Training loss: 1.2605 2.4616 sec/batch
Epoch 19/20  Iteration 3333/3560 Training loss: 1.2603 2.4584 sec/batch
Epoch 19/20  Iteration 3334/3560 Training loss: 1.2600 2.4621 sec/batch
Epoch 19/20  Iteration 3335/3560 Training loss: 1.2596 2.4518 sec/batch
Epoch 19/20  Iteration 3336/3560 Training loss: 1.2594 2.4521 sec/batch
Epoch 19/20  Iteration 3337/3560 Training loss: 1.2595 2.4559 sec/batch
Epoch 19/20  Iteration 3338/3560 Training loss: 1.2595 2.4433 sec/batch
Epoch 19/20  Iteration 3339/3560 Training loss: 1.2594 2.4656 sec/batch
Epoch 19/20  Iteration 3340/3560 Training loss: 1.2595 2.4556 sec/batch
Epoch 19/20  Iteration 3341/3560 Training loss: 1.2597 2.4493 sec/batch
Epoch 19/20  Iteration 3342/3560 Training loss: 1.2598 2.4737 sec/batch
Epoch 19/20  Iteration 3343/3560 Training loss: 1.2599 2.4785 sec/batch
Epoch 19/20  Iteration 3344/3560 Training loss: 1.2599 2.4599 sec/batch
Epoch 19/20  Iteration 3345/3560 Training loss: 1.2603 2.4525 sec/batch
Epoch 19/20  Iteration 3346/3560 Training loss: 1.2603 2.4563 sec/batch
Epoch 19/20  Iteration 3347/3560 Training loss: 1.2603 2.4581 sec/batch
Epoch 19/20  Iteration 3348/3560 Training loss: 1.2605 2.4524 sec/batch
Epoch 19/20  Iteration 3349/3560 Training loss: 1.2604 2.4832 sec/batch
Epoch 19/20  Iteration 3350/3560 Training loss: 1.2606 2.6928 sec/batch
Epoch 19/20  Iteration 3351/3560 Training loss: 1.2607 2.4521 sec/batch
Epoch 19/20  Iteration 3352/3560 Training loss: 1.2610 2.4631 sec/batch
Epoch 19/20  Iteration 3353/3560 Training loss: 1.2611 2.4568 sec/batch
Epoch 19/20  Iteration 3354/3560 Training loss: 1.2610 2.4555 sec/batch
Epoch 19/20  Iteration 3355/3560 Training loss: 1.2607 2.4686 sec/batch
Epoch 19/20  Iteration 3356/3560 Training loss: 1.2606 2.4690 sec/batch
Epoch 19/20  Iteration 3357/3560 Training loss: 1.2607 2.8623 sec/batch
Epoch 19/20  Iteration 3358/3560 Training loss: 1.2606 2.5405 sec/batch
Epoch 19/20  Iteration 3359/3560 Training loss: 1.2606 2.4541 sec/batch
Epoch 19/20  Iteration 3360/3560 Training loss: 1.2606 2.4489 sec/batch
Epoch 19/20  Iteration 3361/3560 Training loss: 1.2606 2.4635 sec/batch
Epoch 19/20  Iteration 3362/3560 Training loss: 1.2605 2.4450 sec/batch
Epoch 19/20  Iteration 3363/3560 Training loss: 1.2604 2.4634 sec/batch
Epoch 19/20  Iteration 3364/3560 Training loss: 1.2604 2.4668 sec/batch
Epoch 19/20  Iteration 3365/3560 Training loss: 1.2606 2.5113 sec/batch
Epoch 19/20  Iteration 3366/3560 Training loss: 1.2606 2.4835 sec/batch
Epoch 19/20  Iteration 3367/3560 Training loss: 1.2605 2.4637 sec/batch
Epoch 19/20  Iteration 3368/3560 Training loss: 1.2605 2.4760 sec/batch
Epoch 19/20  Iteration 3369/3560 Training loss: 1.2605 2.4653 sec/batch
Epoch 19/20  Iteration 3370/3560 Training loss: 1.2605 2.4697 sec/batch
Epoch 19/20  Iteration 3371/3560 Training loss: 1.2606 2.4437 sec/batch
Epoch 19/20  Iteration 3372/3560 Training loss: 1.2610 2.5443 sec/batch
Epoch 19/20  Iteration 3373/3560 Training loss: 1.2610 2.5543 sec/batch
Epoch 19/20  Iteration 3374/3560 Training loss: 1.2611 2.4450 sec/batch
Epoch 19/20  Iteration 3375/3560 Training loss: 1.2610 2.4564 sec/batch
Epoch 19/20  Iteration 3376/3560 Training loss: 1.2608 2.4386 sec/batch
Epoch 19/20  Iteration 3377/3560 Training loss: 1.2609 2.4712 sec/batch
Epoch 19/20  Iteration 3378/3560 Training loss: 1.2609 2.4510 sec/batch
Epoch 19/20  Iteration 3379/3560 Training loss: 1.2610 2.4678 sec/batch
Epoch 19/20  Iteration 3380/3560 Training loss: 1.2608 2.4632 sec/batch
Epoch 19/20  Iteration 3381/3560 Training loss: 1.2608 2.4564 sec/batch
Epoch 19/20  Iteration 3382/3560 Training loss: 1.2610 2.4611 sec/batch
Epoch 20/20  Iteration 3383/3560 Training loss: 1.3721 2.4458 sec/batch
Epoch 20/20  Iteration 3384/3560 Training loss: 1.3215 2.4535 sec/batch
Epoch 20/20  Iteration 3385/3560 Training loss: 1.3018 2.4527 sec/batch
Epoch 20/20  Iteration 3386/3560 Training loss: 1.2956 2.4494 sec/batch
Epoch 20/20  Iteration 3387/3560 Training loss: 1.2835 2.4567 sec/batch
Epoch 20/20  Iteration 3388/3560 Training loss: 1.2722 2.4619 sec/batch
Epoch 20/20  Iteration 3389/3560 Training loss: 1.2710 2.4536 sec/batch
Epoch 20/20  Iteration 3390/3560 Training loss: 1.2686 2.4527 sec/batch
Epoch 20/20  Iteration 3391/3560 Training loss: 1.2672 2.4734 sec/batch
Epoch 20/20  Iteration 3392/3560 Training loss: 1.2653 2.4673 sec/batch
Epoch 20/20  Iteration 3393/3560 Training loss: 1.2609 2.4507 sec/batch
Epoch 20/20  Iteration 3394/3560 Training loss: 1.2610 2.4657 sec/batch
Epoch 20/20  Iteration 3395/3560 Training loss: 1.2613 2.4555 sec/batch
Epoch 20/20  Iteration 3396/3560 Training loss: 1.2616 2.4470 sec/batch
Epoch 20/20  Iteration 3397/3560 Training loss: 1.2607 2.5856 sec/batch
Epoch 20/20  Iteration 3398/3560 Training loss: 1.2595 2.5517 sec/batch
Epoch 20/20  Iteration 3399/3560 Training loss: 1.2599 2.4788 sec/batch
Epoch 20/20  Iteration 3400/3560 Training loss: 1.2612 2.4531 sec/batch
Epoch 20/20  Iteration 3401/3560 Training loss: 1.2617 2.4669 sec/batch
Epoch 20/20  Iteration 3402/3560 Training loss: 1.2626 2.4498 sec/batch
Epoch 20/20  Iteration 3403/3560 Training loss: 1.2621 2.4482 sec/batch
Epoch 20/20  Iteration 3404/3560 Training loss: 1.2628 2.4841 sec/batch
Epoch 20/20  Iteration 3405/3560 Training loss: 1.2623 2.4521 sec/batch
Epoch 20/20  Iteration 3406/3560 Training loss: 1.2623 2.4627 sec/batch
Epoch 20/20  Iteration 3407/3560 Training loss: 1.2623 2.4545 sec/batch
Epoch 20/20  Iteration 3408/3560 Training loss: 1.2599 2.4462 sec/batch
Epoch 20/20  Iteration 3409/3560 Training loss: 1.2588 2.4632 sec/batch
Epoch 20/20  Iteration 3410/3560 Training loss: 1.2597 2.4458 sec/batch
Epoch 20/20  Iteration 3411/3560 Training loss: 1.2597 2.4632 sec/batch
Epoch 20/20  Iteration 3412/3560 Training loss: 1.2601 2.4445 sec/batch
Epoch 20/20  Iteration 3413/3560 Training loss: 1.2596 2.4505 sec/batch
Epoch 20/20  Iteration 3414/3560 Training loss: 1.2587 2.4716 sec/batch
Epoch 20/20  Iteration 3415/3560 Training loss: 1.2589 2.4611 sec/batch
Epoch 20/20  Iteration 3416/3560 Training loss: 1.2589 2.4760 sec/batch
Epoch 20/20  Iteration 3417/3560 Training loss: 1.2585 2.4449 sec/batch
Epoch 20/20  Iteration 3418/3560 Training loss: 1.2582 2.4623 sec/batch
Epoch 20/20  Iteration 3419/3560 Training loss: 1.2575 2.4528 sec/batch
Epoch 20/20  Iteration 3420/3560 Training loss: 1.2563 2.4621 sec/batch
Epoch 20/20  Iteration 3421/3560 Training loss: 1.2553 2.4509 sec/batch
Epoch 20/20  Iteration 3422/3560 Training loss: 1.2550 2.4541 sec/batch
Epoch 20/20  Iteration 3423/3560 Training loss: 1.2541 2.4621 sec/batch
Epoch 20/20  Iteration 3424/3560 Training loss: 1.2551 2.4505 sec/batch
Epoch 20/20  Iteration 3425/3560 Training loss: 1.2546 2.4773 sec/batch
Epoch 20/20  Iteration 3426/3560 Training loss: 1.2539 2.4504 sec/batch
Epoch 20/20  Iteration 3427/3560 Training loss: 1.2541 2.4537 sec/batch
Epoch 20/20  Iteration 3428/3560 Training loss: 1.2535 2.4781 sec/batch
Epoch 20/20  Iteration 3429/3560 Training loss: 1.2530 2.4536 sec/batch
Epoch 20/20  Iteration 3430/3560 Training loss: 1.2524 2.4719 sec/batch
Epoch 20/20  Iteration 3431/3560 Training loss: 1.2523 2.4479 sec/batch
Epoch 20/20  Iteration 3432/3560 Training loss: 1.2527 2.4508 sec/batch
Epoch 20/20  Iteration 3433/3560 Training loss: 1.2521 2.4584 sec/batch
Epoch 20/20  Iteration 3434/3560 Training loss: 1.2530 2.4567 sec/batch
Epoch 20/20  Iteration 3435/3560 Training loss: 1.2531 2.4543 sec/batch
Epoch 20/20  Iteration 3436/3560 Training loss: 1.2534 2.4601 sec/batch
Epoch 20/20  Iteration 3437/3560 Training loss: 1.2531 2.4579 sec/batch
Epoch 20/20  Iteration 3438/3560 Training loss: 1.2533 2.4514 sec/batch
Epoch 20/20  Iteration 3439/3560 Training loss: 1.2536 2.4531 sec/batch
Epoch 20/20  Iteration 3440/3560 Training loss: 1.2532 2.4779 sec/batch
Epoch 20/20  Iteration 3441/3560 Training loss: 1.2527 2.4497 sec/batch
Epoch 20/20  Iteration 3442/3560 Training loss: 1.2531 2.4612 sec/batch
Epoch 20/20  Iteration 3443/3560 Training loss: 1.2531 2.4636 sec/batch
Epoch 20/20  Iteration 3444/3560 Training loss: 1.2539 2.4581 sec/batch
Epoch 20/20  Iteration 3445/3560 Training loss: 1.2540 2.4526 sec/batch
Epoch 20/20  Iteration 3446/3560 Training loss: 1.2541 2.5045 sec/batch
Epoch 20/20  Iteration 3447/3560 Training loss: 1.2540 2.6035 sec/batch
Epoch 20/20  Iteration 3448/3560 Training loss: 1.2543 2.4833 sec/batch
Epoch 20/20  Iteration 3449/3560 Training loss: 1.2546 2.4492 sec/batch
Epoch 20/20  Iteration 3450/3560 Training loss: 1.2543 2.4572 sec/batch
Epoch 20/20  Iteration 3451/3560 Training loss: 1.2544 2.4593 sec/batch
Epoch 20/20  Iteration 3452/3560 Training loss: 1.2545 2.4703 sec/batch
Epoch 20/20  Iteration 3453/3560 Training loss: 1.2550 2.4469 sec/batch
Epoch 20/20  Iteration 3454/3560 Training loss: 1.2554 2.4670 sec/batch
Epoch 20/20  Iteration 3455/3560 Training loss: 1.2559 2.4564 sec/batch
Epoch 20/20  Iteration 3456/3560 Training loss: 1.2556 2.4633 sec/batch
Epoch 20/20  Iteration 3457/3560 Training loss: 1.2556 2.4516 sec/batch
Epoch 20/20  Iteration 3458/3560 Training loss: 1.2557 2.4625 sec/batch
Epoch 20/20  Iteration 3459/3560 Training loss: 1.2558 2.4573 sec/batch
Epoch 20/20  Iteration 3460/3560 Training loss: 1.2557 2.4531 sec/batch
Epoch 20/20  Iteration 3461/3560 Training loss: 1.2552 2.4621 sec/batch
Epoch 20/20  Iteration 3462/3560 Training loss: 1.2553 2.4478 sec/batch
Epoch 20/20  Iteration 3463/3560 Training loss: 1.2549 2.4727 sec/batch
Epoch 20/20  Iteration 3464/3560 Training loss: 1.2548 2.4647 sec/batch
Epoch 20/20  Iteration 3465/3560 Training loss: 1.2544 2.7913 sec/batch
Epoch 20/20  Iteration 3466/3560 Training loss: 1.2545 2.5994 sec/batch
Epoch 20/20  Iteration 3467/3560 Training loss: 1.2542 2.4505 sec/batch
Epoch 20/20  Iteration 3468/3560 Training loss: 1.2541 2.4699 sec/batch
Epoch 20/20  Iteration 3469/3560 Training loss: 1.2538 2.4605 sec/batch
Epoch 20/20  Iteration 3470/3560 Training loss: 1.2535 2.4565 sec/batch
Epoch 20/20  Iteration 3471/3560 Training loss: 1.2532 2.4634 sec/batch
Epoch 20/20  Iteration 3472/3560 Training loss: 1.2532 2.4491 sec/batch
Epoch 20/20  Iteration 3473/3560 Training loss: 1.2529 2.4702 sec/batch
Epoch 20/20  Iteration 3474/3560 Training loss: 1.2528 2.4591 sec/batch
Epoch 20/20  Iteration 3475/3560 Training loss: 1.2524 2.4612 sec/batch
Epoch 20/20  Iteration 3476/3560 Training loss: 1.2521 2.4667 sec/batch
Epoch 20/20  Iteration 3477/3560 Training loss: 1.2519 2.4527 sec/batch
Epoch 20/20  Iteration 3478/3560 Training loss: 1.2520 2.7107 sec/batch
Epoch 20/20  Iteration 3479/3560 Training loss: 1.2520 2.4509 sec/batch
Epoch 20/20  Iteration 3480/3560 Training loss: 1.2516 2.4696 sec/batch
Epoch 20/20  Iteration 3481/3560 Training loss: 1.2511 2.4517 sec/batch
Epoch 20/20  Iteration 3482/3560 Training loss: 1.2508 2.4646 sec/batch
Epoch 20/20  Iteration 3483/3560 Training loss: 1.2508 2.4457 sec/batch
Epoch 20/20  Iteration 3484/3560 Training loss: 1.2507 2.4594 sec/batch
Epoch 20/20  Iteration 3485/3560 Training loss: 1.2506 2.4484 sec/batch
Epoch 20/20  Iteration 3486/3560 Training loss: 1.2504 2.4481 sec/batch
Epoch 20/20  Iteration 3487/3560 Training loss: 1.2503 2.4601 sec/batch
Epoch 20/20  Iteration 3488/3560 Training loss: 1.2502 2.4700 sec/batch
Epoch 20/20  Iteration 3489/3560 Training loss: 1.2502 2.4671 sec/batch
Epoch 20/20  Iteration 3490/3560 Training loss: 1.2502 2.4431 sec/batch
Epoch 20/20  Iteration 3491/3560 Training loss: 1.2502 2.4671 sec/batch
Epoch 20/20  Iteration 3492/3560 Training loss: 1.2503 2.4742 sec/batch
Epoch 20/20  Iteration 3493/3560 Training loss: 1.2500 2.4505 sec/batch
Epoch 20/20  Iteration 3494/3560 Training loss: 1.2500 2.6503 sec/batch
Epoch 20/20  Iteration 3495/3560 Training loss: 1.2499 2.5614 sec/batch
Epoch 20/20  Iteration 3496/3560 Training loss: 1.2497 2.4716 sec/batch
Epoch 20/20  Iteration 3497/3560 Training loss: 1.2495 2.4499 sec/batch
Epoch 20/20  Iteration 3498/3560 Training loss: 1.2490 2.4546 sec/batch
Epoch 20/20  Iteration 3499/3560 Training loss: 1.2490 2.4762 sec/batch
Epoch 20/20  Iteration 3500/3560 Training loss: 1.2491 2.4703 sec/batch
Epoch 20/20  Iteration 3501/3560 Training loss: 1.2489 2.4721 sec/batch
Epoch 20/20  Iteration 3502/3560 Training loss: 1.2488 2.4491 sec/batch
Epoch 20/20  Iteration 3503/3560 Training loss: 1.2487 2.4516 sec/batch
Epoch 20/20  Iteration 3504/3560 Training loss: 1.2484 2.4612 sec/batch
Epoch 20/20  Iteration 3505/3560 Training loss: 1.2480 2.4505 sec/batch
Epoch 20/20  Iteration 3506/3560 Training loss: 1.2480 2.4734 sec/batch
Epoch 20/20  Iteration 3507/3560 Training loss: 1.2478 2.4567 sec/batch
Epoch 20/20  Iteration 3508/3560 Training loss: 1.2473 2.4604 sec/batch
Epoch 20/20  Iteration 3509/3560 Training loss: 1.2473 2.4684 sec/batch
Epoch 20/20  Iteration 3510/3560 Training loss: 1.2472 2.4486 sec/batch
Epoch 20/20  Iteration 3511/3560 Training loss: 1.2470 2.4637 sec/batch
Epoch 20/20  Iteration 3512/3560 Training loss: 1.2467 2.4594 sec/batch
Epoch 20/20  Iteration 3513/3560 Training loss: 1.2463 2.4751 sec/batch
Epoch 20/20  Iteration 3514/3560 Training loss: 1.2461 2.4666 sec/batch
Epoch 20/20  Iteration 3515/3560 Training loss: 1.2462 2.4539 sec/batch
Epoch 20/20  Iteration 3516/3560 Training loss: 1.2462 2.4726 sec/batch
Epoch 20/20  Iteration 3517/3560 Training loss: 1.2462 2.4501 sec/batch
Epoch 20/20  Iteration 3518/3560 Training loss: 1.2463 2.4599 sec/batch
Epoch 20/20  Iteration 3519/3560 Training loss: 1.2464 2.4608 sec/batch
Epoch 20/20  Iteration 3520/3560 Training loss: 1.2466 2.4527 sec/batch
Epoch 20/20  Iteration 3521/3560 Training loss: 1.2466 2.4607 sec/batch
Epoch 20/20  Iteration 3522/3560 Training loss: 1.2466 2.4472 sec/batch
Epoch 20/20  Iteration 3523/3560 Training loss: 1.2469 2.4576 sec/batch
Epoch 20/20  Iteration 3524/3560 Training loss: 1.2470 2.4476 sec/batch
Epoch 20/20  Iteration 3525/3560 Training loss: 1.2470 2.4808 sec/batch
Epoch 20/20  Iteration 3526/3560 Training loss: 1.2472 2.4525 sec/batch
Epoch 20/20  Iteration 3527/3560 Training loss: 1.2471 2.4515 sec/batch
Epoch 20/20  Iteration 3528/3560 Training loss: 1.2473 2.4634 sec/batch
Epoch 20/20  Iteration 3529/3560 Training loss: 1.2473 2.4468 sec/batch
Epoch 20/20  Iteration 3530/3560 Training loss: 1.2476 2.4643 sec/batch
Epoch 20/20  Iteration 3531/3560 Training loss: 1.2477 2.4456 sec/batch
Epoch 20/20  Iteration 3532/3560 Training loss: 1.2476 2.4663 sec/batch
Epoch 20/20  Iteration 3533/3560 Training loss: 1.2472 2.4477 sec/batch
Epoch 20/20  Iteration 3534/3560 Training loss: 1.2471 2.4516 sec/batch
Epoch 20/20  Iteration 3535/3560 Training loss: 1.2471 2.4581 sec/batch
Epoch 20/20  Iteration 3536/3560 Training loss: 1.2471 2.4570 sec/batch
Epoch 20/20  Iteration 3537/3560 Training loss: 1.2470 2.4764 sec/batch
Epoch 20/20  Iteration 3538/3560 Training loss: 1.2470 2.4534 sec/batch
Epoch 20/20  Iteration 3539/3560 Training loss: 1.2470 2.4701 sec/batch
Epoch 20/20  Iteration 3540/3560 Training loss: 1.2469 2.4567 sec/batch
Epoch 20/20  Iteration 3541/3560 Training loss: 1.2466 2.4520 sec/batch
Epoch 20/20  Iteration 3542/3560 Training loss: 1.2467 2.4584 sec/batch
Epoch 20/20  Iteration 3543/3560 Training loss: 1.2468 2.6327 sec/batch
Epoch 20/20  Iteration 3544/3560 Training loss: 1.2468 2.5265 sec/batch
Epoch 20/20  Iteration 3545/3560 Training loss: 1.2468 2.4494 sec/batch
Epoch 20/20  Iteration 3546/3560 Training loss: 1.2467 2.4488 sec/batch
Epoch 20/20  Iteration 3547/3560 Training loss: 1.2467 2.4623 sec/batch
Epoch 20/20  Iteration 3548/3560 Training loss: 1.2467 2.4597 sec/batch
Epoch 20/20  Iteration 3549/3560 Training loss: 1.2469 2.4883 sec/batch
Epoch 20/20  Iteration 3550/3560 Training loss: 1.2472 2.4452 sec/batch
Epoch 20/20  Iteration 3551/3560 Training loss: 1.2473 2.4683 sec/batch
Epoch 20/20  Iteration 3552/3560 Training loss: 1.2473 2.4536 sec/batch
Epoch 20/20  Iteration 3553/3560 Training loss: 1.2473 2.4443 sec/batch
Epoch 20/20  Iteration 3554/3560 Training loss: 1.2471 2.4559 sec/batch
Epoch 20/20  Iteration 3555/3560 Training loss: 1.2472 2.4376 sec/batch
Epoch 20/20  Iteration 3556/3560 Training loss: 1.2472 2.4694 sec/batch
Epoch 20/20  Iteration 3557/3560 Training loss: 1.2472 2.4526 sec/batch
Epoch 20/20  Iteration 3558/3560 Training loss: 1.2471 2.4482 sec/batch
Epoch 20/20  Iteration 3559/3560 Training loss: 1.2470 2.4582 sec/batch
Epoch 20/20  Iteration 3560/3560 Training loss: 1.2472 2.4510 sec/batch
Epoch 1/20  Iteration 1/3560 Training loss: 4.4184 5.9151 sec/batch
Epoch 1/20  Iteration 2/3560 Training loss: 4.3203 5.9017 sec/batch
Epoch 1/20  Iteration 3/3560 Training loss: 5.0684 5.8663 sec/batch
Epoch 1/20  Iteration 4/3560 Training loss: 4.8961 5.9055 sec/batch
Epoch 1/20  Iteration 5/3560 Training loss: 4.6781 5.8664 sec/batch
Epoch 1/20  Iteration 6/3560 Training loss: 4.5064 5.8733 sec/batch
Epoch 1/20  Iteration 7/3560 Training loss: 4.3697 5.8702 sec/batch
Epoch 1/20  Iteration 8/3560 Training loss: 4.2602 5.8800 sec/batch
Epoch 1/20  Iteration 9/3560 Training loss: 4.1673 5.8704 sec/batch
Epoch 1/20  Iteration 10/3560 Training loss: 4.0897 5.8973 sec/batch
Epoch 1/20  Iteration 11/3560 Training loss: 4.0225 5.8708 sec/batch
Epoch 1/20  Iteration 12/3560 Training loss: 3.9644 5.8815 sec/batch
Epoch 1/20  Iteration 13/3560 Training loss: 3.9125 6.0811 sec/batch
Epoch 1/20  Iteration 14/3560 Training loss: 3.8682 5.9163 sec/batch
Epoch 1/20  Iteration 15/3560 Training loss: 3.8280 5.8608 sec/batch
Epoch 1/20  Iteration 16/3560 Training loss: 3.7928 5.8651 sec/batch
Epoch 1/20  Iteration 17/3560 Training loss: 3.7615 6.2742 sec/batch
Epoch 1/20  Iteration 18/3560 Training loss: 3.7342 5.9051 sec/batch
Epoch 1/20  Iteration 19/3560 Training loss: 3.7088 5.8513 sec/batch
Epoch 1/20  Iteration 20/3560 Training loss: 3.6835 5.8328 sec/batch
Epoch 1/20  Iteration 21/3560 Training loss: 3.6611 5.8869 sec/batch
Epoch 1/20  Iteration 22/3560 Training loss: 3.6411 5.8886 sec/batch
Epoch 1/20  Iteration 23/3560 Training loss: 3.6226 5.8663 sec/batch
Epoch 1/20  Iteration 24/3560 Training loss: 3.6055 5.8218 sec/batch
Epoch 1/20  Iteration 25/3560 Training loss: 3.5892 5.8722 sec/batch
Epoch 1/20  Iteration 26/3560 Training loss: 3.5746 5.8764 sec/batch
Epoch 1/20  Iteration 27/3560 Training loss: 3.5609 5.8479 sec/batch
Epoch 1/20  Iteration 28/3560 Training loss: 3.5473 5.8627 sec/batch
Epoch 1/20  Iteration 29/3560 Training loss: 3.5348 5.8600 sec/batch
Epoch 1/20  Iteration 30/3560 Training loss: 3.5233 5.9277 sec/batch
Epoch 1/20  Iteration 31/3560 Training loss: 3.5131 5.8517 sec/batch
Epoch 1/20  Iteration 32/3560 Training loss: 3.5024 5.9200 sec/batch
Epoch 1/20  Iteration 33/3560 Training loss: 3.4922 5.8755 sec/batch
Epoch 1/20  Iteration 34/3560 Training loss: 3.4830 6.1180 sec/batch
Epoch 1/20  Iteration 35/3560 Training loss: 3.4739 5.8288 sec/batch
Epoch 1/20  Iteration 36/3560 Training loss: 3.4656 5.9085 sec/batch
Epoch 1/20  Iteration 37/3560 Training loss: 3.4569 5.8416 sec/batch
Epoch 1/20  Iteration 38/3560 Training loss: 3.4487 5.8939 sec/batch
Epoch 1/20  Iteration 39/3560 Training loss: 3.4407 5.8795 sec/batch
Epoch 1/20  Iteration 40/3560 Training loss: 3.4333 5.8448 sec/batch
Epoch 1/20  Iteration 41/3560 Training loss: 3.4259 5.8126 sec/batch
Epoch 1/20  Iteration 42/3560 Training loss: 3.4191 5.8458 sec/batch
Epoch 1/20  Iteration 43/3560 Training loss: 3.4123 5.8468 sec/batch
Epoch 1/20  Iteration 44/3560 Training loss: 3.4059 5.8722 sec/batch
Epoch 1/20  Iteration 45/3560 Training loss: 3.3996 5.8938 sec/batch
Epoch 1/20  Iteration 46/3560 Training loss: 3.3939 5.8792 sec/batch
Epoch 1/20  Iteration 47/3560 Training loss: 3.3883 5.8869 sec/batch
Epoch 1/20  Iteration 48/3560 Training loss: 3.3830 5.8572 sec/batch
Epoch 1/20  Iteration 49/3560 Training loss: 3.3779 5.8848 sec/batch
Epoch 1/20  Iteration 50/3560 Training loss: 3.3730 5.8459 sec/batch
Epoch 1/20  Iteration 51/3560 Training loss: 3.3679 5.8944 sec/batch
Epoch 1/20  Iteration 52/3560 Training loss: 3.3628 5.8955 sec/batch
Epoch 1/20  Iteration 53/3560 Training loss: 3.3582 5.8670 sec/batch
Epoch 1/20  Iteration 54/3560 Training loss: 3.3534 6.1242 sec/batch
Epoch 1/20  Iteration 55/3560 Training loss: 3.3487 5.8877 sec/batch
Epoch 1/20  Iteration 56/3560 Training loss: 3.3438 5.8865 sec/batch
Epoch 1/20  Iteration 57/3560 Training loss: 3.3394 5.8514 sec/batch
Epoch 1/20  Iteration 58/3560 Training loss: 3.3349 5.8384 sec/batch
Epoch 1/20  Iteration 59/3560 Training loss: 3.3303 5.8682 sec/batch
Epoch 1/20  Iteration 60/3560 Training loss: 3.3260 5.8829 sec/batch
Epoch 1/20  Iteration 61/3560 Training loss: 3.3217 5.8470 sec/batch
Epoch 1/20  Iteration 62/3560 Training loss: 3.3176 5.9133 sec/batch
Epoch 1/20  Iteration 63/3560 Training loss: 3.3137 5.8595 sec/batch
Epoch 1/20  Iteration 64/3560 Training loss: 3.3089 5.8701 sec/batch
Epoch 1/20  Iteration 65/3560 Training loss: 3.3044 5.8679 sec/batch
Epoch 1/20  Iteration 66/3560 Training loss: 3.3002 5.8433 sec/batch
Epoch 1/20  Iteration 67/3560 Training loss: 3.2959 6.0172 sec/batch
Epoch 1/20  Iteration 68/3560 Training loss: 3.2908 5.9541 sec/batch
Epoch 1/20  Iteration 69/3560 Training loss: 3.2862 5.9268 sec/batch
Epoch 1/20  Iteration 70/3560 Training loss: 3.2819 5.8823 sec/batch
Epoch 1/20  Iteration 71/3560 Training loss: 3.2772 5.8416 sec/batch
Epoch 1/20  Iteration 72/3560 Training loss: 3.2728 5.9011 sec/batch
Epoch 1/20  Iteration 73/3560 Training loss: 3.2681 5.8926 sec/batch
Epoch 1/20  Iteration 74/3560 Training loss: 3.2638 6.0158 sec/batch
Epoch 1/20  Iteration 75/3560 Training loss: 3.2593 5.9348 sec/batch
Epoch 1/20  Iteration 76/3560 Training loss: 3.2548 5.8829 sec/batch
Epoch 1/20  Iteration 77/3560 Training loss: 3.2501 5.8973 sec/batch
Epoch 1/20  Iteration 78/3560 Training loss: 3.2454 5.8368 sec/batch
Epoch 1/20  Iteration 79/3560 Training loss: 3.2421 5.8588 sec/batch
Epoch 1/20  Iteration 80/3560 Training loss: 3.2382 5.8922 sec/batch
Epoch 1/20  Iteration 81/3560 Training loss: 3.2341 5.8797 sec/batch
Epoch 1/20  Iteration 82/3560 Training loss: 3.2299 5.8742 sec/batch
Epoch 1/20  Iteration 83/3560 Training loss: 3.2255 5.8774 sec/batch
Epoch 1/20  Iteration 84/3560 Training loss: 3.2208 5.8345 sec/batch
Epoch 1/20  Iteration 85/3560 Training loss: 3.2158 5.8179 sec/batch
Epoch 1/20  Iteration 86/3560 Training loss: 3.2109 5.8706 sec/batch
Epoch 1/20  Iteration 87/3560 Training loss: 3.2058 5.9297 sec/batch
Epoch 1/20  Iteration 88/3560 Training loss: 3.2008 5.8891 sec/batch
Epoch 1/20  Iteration 89/3560 Training loss: 3.1958 5.8172 sec/batch
Epoch 1/20  Iteration 90/3560 Training loss: 3.1908 5.8914 sec/batch
Epoch 1/20  Iteration 91/3560 Training loss: 3.1858 5.8963 sec/batch
Epoch 1/20  Iteration 92/3560 Training loss: 3.1817 5.8406 sec/batch
Epoch 1/20  Iteration 93/3560 Training loss: 3.1802 5.8699 sec/batch
Epoch 1/20  Iteration 94/3560 Training loss: 3.1782 5.8387 sec/batch
Epoch 1/20  Iteration 95/3560 Training loss: 3.1756 6.0982 sec/batch
Epoch 1/20  Iteration 96/3560 Training loss: 3.1722 5.8395 sec/batch
Epoch 1/20  Iteration 97/3560 Training loss: 3.1677 5.8587 sec/batch
Epoch 1/20  Iteration 98/3560 Training loss: 3.1629 5.8632 sec/batch
Epoch 1/20  Iteration 99/3560 Training loss: 3.1582 5.8757 sec/batch
Epoch 1/20  Iteration 100/3560 Training loss: 3.1535 5.8459 sec/batch
Epoch 1/20  Iteration 101/3560 Training loss: 3.1489 5.8762 sec/batch
Epoch 1/20  Iteration 102/3560 Training loss: 3.1442 5.8267 sec/batch
Epoch 1/20  Iteration 103/3560 Training loss: 3.1393 5.8939 sec/batch
Epoch 1/20  Iteration 104/3560 Training loss: 3.1343 5.8377 sec/batch
Epoch 1/20  Iteration 105/3560 Training loss: 3.1294 5.8811 sec/batch
Epoch 1/20  Iteration 106/3560 Training loss: 3.1246 5.8527 sec/batch
Epoch 1/20  Iteration 107/3560 Training loss: 3.1197 5.8684 sec/batch
Epoch 1/20  Iteration 108/3560 Training loss: 3.1151 5.8371 sec/batch
Epoch 1/20  Iteration 109/3560 Training loss: 3.1104 5.8252 sec/batch
Epoch 1/20  Iteration 110/3560 Training loss: 3.1053 5.8542 sec/batch
Epoch 1/20  Iteration 111/3560 Training loss: 3.1007 5.9094 sec/batch
Epoch 1/20  Iteration 112/3560 Training loss: 3.0960 5.9197 sec/batch
Epoch 1/20  Iteration 113/3560 Training loss: 3.0912 5.8979 sec/batch
Epoch 1/20  Iteration 114/3560 Training loss: 3.0865 5.8905 sec/batch
Epoch 1/20  Iteration 115/3560 Training loss: 3.0822 6.1250 sec/batch
Epoch 1/20  Iteration 116/3560 Training loss: 3.0773 5.9167 sec/batch
Epoch 1/20  Iteration 117/3560 Training loss: 3.0728 5.8836 sec/batch
Epoch 1/20  Iteration 118/3560 Training loss: 3.0683 6.2906 sec/batch
Epoch 1/20  Iteration 119/3560 Training loss: 3.0641 6.1088 sec/batch
Epoch 1/20  Iteration 120/3560 Training loss: 3.0596 5.8497 sec/batch
Epoch 1/20  Iteration 121/3560 Training loss: 3.0554 5.9122 sec/batch
Epoch 1/20  Iteration 122/3560 Training loss: 3.0512 5.9285 sec/batch
Epoch 1/20  Iteration 123/3560 Training loss: 3.0467 5.9312 sec/batch
Epoch 1/20  Iteration 124/3560 Training loss: 3.0425 5.8731 sec/batch
Epoch 1/20  Iteration 125/3560 Training loss: 3.0383 5.8636 sec/batch
Epoch 1/20  Iteration 126/3560 Training loss: 3.0338 5.8532 sec/batch
Epoch 1/20  Iteration 127/3560 Training loss: 3.0297 5.8750 sec/batch
Epoch 1/20  Iteration 128/3560 Training loss: 3.0256 5.8388 sec/batch
Epoch 1/20  Iteration 129/3560 Training loss: 3.0214 5.9172 sec/batch
Epoch 1/20  Iteration 130/3560 Training loss: 3.0172 5.8848 sec/batch
Epoch 1/20  Iteration 131/3560 Training loss: 3.0131 5.8501 sec/batch
Epoch 1/20  Iteration 132/3560 Training loss: 3.0088 5.8262 sec/batch
Epoch 1/20  Iteration 133/3560 Training loss: 3.0048 5.8809 sec/batch
Epoch 1/20  Iteration 134/3560 Training loss: 3.0008 5.8747 sec/batch
Epoch 1/20  Iteration 135/3560 Training loss: 2.9965 6.0046 sec/batch
Epoch 1/20  Iteration 136/3560 Training loss: 2.9925 6.0062 sec/batch
Epoch 1/20  Iteration 137/3560 Training loss: 2.9884 5.8406 sec/batch
Epoch 1/20  Iteration 138/3560 Training loss: 2.9845 5.8726 sec/batch
Epoch 1/20  Iteration 139/3560 Training loss: 2.9807 5.8643 sec/batch
Epoch 1/20  Iteration 140/3560 Training loss: 2.9767 5.8524 sec/batch
Epoch 1/20  Iteration 141/3560 Training loss: 2.9729 5.8978 sec/batch
Epoch 1/20  Iteration 142/3560 Training loss: 2.9689 5.8399 sec/batch
Epoch 1/20  Iteration 143/3560 Training loss: 2.9651 5.8407 sec/batch
Epoch 1/20  Iteration 144/3560 Training loss: 2.9612 5.8923 sec/batch
Epoch 1/20  Iteration 145/3560 Training loss: 2.9574 5.8680 sec/batch
Epoch 1/20  Iteration 146/3560 Training loss: 2.9539 5.8285 sec/batch
Epoch 1/20  Iteration 147/3560 Training loss: 2.9501 5.8090 sec/batch
Epoch 1/20  Iteration 148/3560 Training loss: 2.9467 5.8773 sec/batch
Epoch 1/20  Iteration 149/3560 Training loss: 2.9429 5.8784 sec/batch
Epoch 1/20  Iteration 150/3560 Training loss: 2.9392 5.8932 sec/batch
Epoch 1/20  Iteration 151/3560 Training loss: 2.9358 5.8477 sec/batch
Epoch 1/20  Iteration 152/3560 Training loss: 2.9325 5.8203 sec/batch
Epoch 1/20  Iteration 153/3560 Training loss: 2.9290 5.8886 sec/batch
Epoch 1/20  Iteration 154/3560 Training loss: 2.9256 5.8491 sec/batch
Epoch 1/20  Iteration 155/3560 Training loss: 2.9219 5.8521 sec/batch
Epoch 1/20  Iteration 156/3560 Training loss: 2.9184 6.1439 sec/batch
Epoch 1/20  Iteration 157/3560 Training loss: 2.9148 5.8744 sec/batch
Epoch 1/20  Iteration 158/3560 Training loss: 2.9113 5.8896 sec/batch
Epoch 1/20  Iteration 159/3560 Training loss: 2.9078 5.8524 sec/batch
Epoch 1/20  Iteration 160/3560 Training loss: 2.9045 5.8701 sec/batch
Epoch 1/20  Iteration 161/3560 Training loss: 2.9011 5.8588 sec/batch
Epoch 1/20  Iteration 162/3560 Training loss: 2.8975 5.8334 sec/batch
Epoch 1/20  Iteration 163/3560 Training loss: 2.8941 5.9229 sec/batch
Epoch 1/20  Iteration 164/3560 Training loss: 2.8907 5.8363 sec/batch
Epoch 1/20  Iteration 165/3560 Training loss: 2.8874 5.9079 sec/batch
Epoch 1/20  Iteration 166/3560 Training loss: 2.8841 5.8803 sec/batch
Epoch 1/20  Iteration 167/3560 Training loss: 2.8808 5.8966 sec/batch
Epoch 1/20  Iteration 168/3560 Training loss: 2.8776 5.8848 sec/batch
Epoch 1/20  Iteration 169/3560 Training loss: 2.8743 6.3930 sec/batch
Epoch 1/20  Iteration 170/3560 Training loss: 2.8709 6.1534 sec/batch
Epoch 1/20  Iteration 171/3560 Training loss: 2.8677 5.9531 sec/batch
Epoch 1/20  Iteration 172/3560 Training loss: 2.8647 5.9215 sec/batch
Epoch 1/20  Iteration 173/3560 Training loss: 2.8617 5.8788 sec/batch
Epoch 1/20  Iteration 174/3560 Training loss: 2.8587 5.8748 sec/batch
Epoch 1/20  Iteration 175/3560 Training loss: 2.8558 5.9506 sec/batch
Epoch 1/20  Iteration 176/3560 Training loss: 2.8526 6.0878 sec/batch
Epoch 1/20  Iteration 177/3560 Training loss: 2.8494 5.8441 sec/batch
Epoch 1/20  Iteration 178/3560 Training loss: 2.8462 5.8521 sec/batch
Epoch 2/20  Iteration 179/3560 Training loss: 2.3367 5.8764 sec/batch
...
Epoch 2/20  Iteration 356/3560 Training loss: 2.0335 5.8588 sec/batch
Epoch 3/20  Iteration 357/3560 Training loss: 1.9332 5.9669 sec/batch
...
Epoch 3/20  Iteration 534/3560 Training loss: 1.7322 5.8616 sec/batch
Epoch 4/20  Iteration 535/3560 Training loss: 1.7202 5.8663 sec/batch
...
Epoch 4/20  Iteration 712/3560 Training loss: 1.5734 5.8436 sec/batch
Epoch 5/20  Iteration 713/3560 Training loss: 1.6136 5.9134 sec/batch
...
Epoch 5/20  Iteration 890/3560 Training loss: 1.4761 5.8552 sec/batch
Epoch 6/20  Iteration 891/3560 Training loss: 1.5232 5.8294 sec/batch
...
Epoch 6/20  Iteration 1068/3560 Training loss: 1.4090 6.3290 sec/batch
Epoch 7/20  Iteration 1069/3560 Training loss: 1.4785 6.3369 sec/batch
Epoch 7/20  Iteration 1070/3560 Training loss: 1.4393 6.1827 sec/batch
Epoch 7/20  Iteration 1071/3560 Training loss: 1.4238 5.9715 sec/batch
Epoch 7/20  Iteration 1072/3560 Training loss: 1.4185 6.3488 sec/batch
Epoch 7/20  Iteration 1073/3560 Training loss: 1.4075 6.2647 sec/batch
Epoch 7/20  Iteration 1074/3560 Training loss: 1.3979 6.0581 sec/batch
Epoch 7/20  Iteration 1075/3560 Training loss: 1.3979 5.8805 sec/batch
Epoch 7/20  Iteration 1076/3560 Training loss: 1.3954 5.9817 sec/batch
Epoch 7/20  Iteration 1077/3560 Training loss: 1.3955 6.0647 sec/batch
Epoch 7/20  Iteration 1078/3560 Training loss: 1.3935 5.9232 sec/batch
Epoch 7/20  Iteration 1079/3560 Training loss: 1.3903 6.4036 sec/batch
Epoch 7/20  Iteration 1080/3560 Training loss: 1.3888 6.3933 sec/batch
Epoch 7/20  Iteration 1081/3560 Training loss: 1.3878 6.2924 sec/batch
Epoch 7/20  Iteration 1082/3560 Training loss: 1.3887 5.9533 sec/batch
Epoch 7/20  Iteration 1083/3560 Training loss: 1.3875 6.0451 sec/batch
Epoch 7/20  Iteration 1084/3560 Training loss: 1.3854 5.8905 sec/batch
Epoch 7/20  Iteration 1085/3560 Training loss: 1.3852 5.9167 sec/batch
Epoch 7/20  Iteration 1086/3560 Training loss: 1.3860 6.2302 sec/batch
Epoch 7/20  Iteration 1087/3560 Training loss: 1.3859 8.1703 sec/batch
Epoch 7/20  Iteration 1088/3560 Training loss: 1.3872 9.4033 sec/batch
Epoch 7/20  Iteration 1089/3560 Training loss: 1.3871 9.3341 sec/batch
Epoch 7/20  Iteration 1090/3560 Training loss: 1.3875 9.1270 sec/batch
Epoch 7/20  Iteration 1091/3560 Training loss: 1.3869 8.1809 sec/batch
Epoch 7/20  Iteration 1092/3560 Training loss: 1.3866 9.1468 sec/batch
Epoch 7/20  Iteration 1093/3560 Training loss: 1.3864 8.2678 sec/batch
Epoch 7/20  Iteration 1094/3560 Training loss: 1.3842 7.3826 sec/batch
Epoch 7/20  Iteration 1095/3560 Training loss: 1.3832 7.2693 sec/batch
Epoch 7/20  Iteration 1096/3560 Training loss: 1.3839 7.7919 sec/batch
Epoch 7/20  Iteration 1097/3560 Training loss: 1.3841 7.6202 sec/batch
Epoch 7/20  Iteration 1098/3560 Training loss: 1.3842 7.6072 sec/batch
Epoch 7/20  Iteration 1099/3560 Training loss: 1.3838 8.7239 sec/batch
Epoch 7/20  Iteration 1100/3560 Training loss: 1.3827 10.4875 sec/batch
Epoch 7/20  Iteration 1101/3560 Training loss: 1.3829 7.4347 sec/batch
Epoch 7/20  Iteration 1102/3560 Training loss: 1.3831 7.0805 sec/batch
Epoch 7/20  Iteration 1103/3560 Training loss: 1.3828 7.5834 sec/batch
Epoch 7/20  Iteration 1104/3560 Training loss: 1.3823 6.5851 sec/batch
Epoch 7/20  Iteration 1105/3560 Training loss: 1.3814 6.2165 sec/batch
Epoch 7/20  Iteration 1106/3560 Training loss: 1.3805 6.5147 sec/batch
Epoch 7/20  Iteration 1107/3560 Training loss: 1.3792 6.3937 sec/batch
Epoch 7/20  Iteration 1108/3560 Training loss: 1.3786 6.7689 sec/batch
Epoch 7/20  Iteration 1109/3560 Training loss: 1.3777 6.0617 sec/batch
Epoch 7/20  Iteration 1110/3560 Training loss: 1.3784 6.2092 sec/batch
Epoch 7/20  Iteration 1111/3560 Training loss: 1.3779 6.3279 sec/batch
Epoch 7/20  Iteration 1112/3560 Training loss: 1.3770 7.7193 sec/batch
Epoch 7/20  Iteration 1113/3560 Training loss: 1.3770 6.4462 sec/batch
Epoch 7/20  Iteration 1114/3560 Training loss: 1.3761 6.0531 sec/batch
Epoch 7/20  Iteration 1115/3560 Training loss: 1.3760 7.5860 sec/batch
Epoch 7/20  Iteration 1116/3560 Training loss: 1.3754 6.6068 sec/batch
Epoch 7/20  Iteration 1117/3560 Training loss: 1.3751 6.0904 sec/batch
Epoch 7/20  Iteration 1118/3560 Training loss: 1.3752 6.3536 sec/batch
Epoch 7/20  Iteration 1119/3560 Training loss: 1.3745 6.0465 sec/batch
Epoch 7/20  Iteration 1120/3560 Training loss: 1.3752 6.0190 sec/batch
Epoch 7/20  Iteration 1121/3560 Training loss: 1.3750 6.4770 sec/batch
Epoch 7/20  Iteration 1122/3560 Training loss: 1.3753 5.9833 sec/batch
Epoch 7/20  Iteration 1123/3560 Training loss: 1.3749 5.9531 sec/batch
Epoch 7/20  Iteration 1124/3560 Training loss: 1.3748 6.0197 sec/batch
Epoch 7/20  Iteration 1125/3560 Training loss: 1.3751 6.0170 sec/batch
Epoch 7/20  Iteration 1126/3560 Training loss: 1.3747 6.5028 sec/batch
Epoch 7/20  Iteration 1127/3560 Training loss: 1.3741 6.0458 sec/batch
Epoch 7/20  Iteration 1128/3560 Training loss: 1.3747 5.9596 sec/batch
Epoch 7/20  Iteration 1129/3560 Training loss: 1.3747 5.9598 sec/batch
Epoch 7/20  Iteration 1130/3560 Training loss: 1.3756 5.9759 sec/batch
Epoch 7/20  Iteration 1131/3560 Training loss: 1.3757 6.0414 sec/batch
Epoch 7/20  Iteration 1132/3560 Training loss: 1.3758 6.1229 sec/batch
Epoch 7/20  Iteration 1133/3560 Training loss: 1.3756 5.9258 sec/batch
Epoch 7/20  Iteration 1134/3560 Training loss: 1.3754 6.0038 sec/batch
Epoch 7/20  Iteration 1135/3560 Training loss: 1.3755 6.0014 sec/batch
Epoch 7/20  Iteration 1136/3560 Training loss: 1.3750 5.9567 sec/batch
Epoch 7/20  Iteration 1137/3560 Training loss: 1.3751 6.1428 sec/batch
Epoch 7/20  Iteration 1138/3560 Training loss: 1.3749 6.1077 sec/batch
Epoch 7/20  Iteration 1139/3560 Training loss: 1.3754 6.1992 sec/batch
Epoch 7/20  Iteration 1140/3560 Training loss: 1.3757 6.7708 sec/batch
Epoch 7/20  Iteration 1141/3560 Training loss: 1.3760 6.2005 sec/batch
Epoch 7/20  Iteration 1142/3560 Training loss: 1.3756 6.3180 sec/batch
Epoch 7/20  Iteration 1143/3560 Training loss: 1.3754 6.2628 sec/batch
Epoch 7/20  Iteration 1144/3560 Training loss: 1.3755 5.9764 sec/batch
Epoch 7/20  Iteration 1145/3560 Training loss: 1.3753 5.9747 sec/batch
Epoch 7/20  Iteration 1146/3560 Training loss: 1.3751 5.9216 sec/batch
Epoch 7/20  Iteration 1147/3560 Training loss: 1.3745 5.9297 sec/batch
Epoch 7/20  Iteration 1148/3560 Training loss: 1.3742 5.9408 sec/batch
Epoch 7/20  Iteration 1149/3560 Training loss: 1.3736 6.0507 sec/batch
Epoch 7/20  Iteration 1150/3560 Training loss: 1.3735 5.9406 sec/batch
Epoch 7/20  Iteration 1151/3560 Training loss: 1.3729 5.9584 sec/batch
Epoch 7/20  Iteration 1152/3560 Training loss: 1.3728 6.2112 sec/batch
Epoch 7/20  Iteration 1153/3560 Training loss: 1.3726 7.1901 sec/batch
Epoch 7/20  Iteration 1154/3560 Training loss: 1.3724 6.1874 sec/batch
Epoch 7/20  Iteration 1155/3560 Training loss: 1.3722 5.8759 sec/batch
Epoch 7/20  Iteration 1156/3560 Training loss: 1.3719 6.0293 sec/batch
Epoch 7/20  Iteration 1157/3560 Training loss: 1.3715 6.2328 sec/batch
Epoch 7/20  Iteration 1158/3560 Training loss: 1.3715 6.1267 sec/batch
Epoch 7/20  Iteration 1159/3560 Training loss: 1.3712 7.1566 sec/batch
Epoch 7/20  Iteration 1160/3560 Training loss: 1.3711 9.3358 sec/batch
Epoch 7/20  Iteration 1161/3560 Training loss: 1.3706 11.1542 sec/batch
Epoch 7/20  Iteration 1162/3560 Training loss: 1.3701 7.1477 sec/batch
Epoch 7/20  Iteration 1163/3560 Training loss: 1.3698 6.6692 sec/batch
Epoch 7/20  Iteration 1164/3560 Training loss: 1.3699 7.0097 sec/batch
Epoch 7/20  Iteration 1165/3560 Training loss: 1.3698 7.1657 sec/batch
Epoch 7/20  Iteration 1166/3560 Training loss: 1.3693 7.0658 sec/batch
Epoch 7/20  Iteration 1167/3560 Training loss: 1.3688 6.9295 sec/batch
Epoch 7/20  Iteration 1168/3560 Training loss: 1.3683 6.8922 sec/batch
Epoch 7/20  Iteration 1169/3560 Training loss: 1.3682 6.6697 sec/batch
Epoch 7/20  Iteration 1170/3560 Training loss: 1.3680 6.5934 sec/batch
Epoch 7/20  Iteration 1171/3560 Training loss: 1.3679 6.6395 sec/batch
Epoch 7/20  Iteration 1172/3560 Training loss: 1.3676 6.6355 sec/batch
Epoch 7/20  Iteration 1173/3560 Training loss: 1.3674 6.6689 sec/batch
Epoch 7/20  Iteration 1174/3560 Training loss: 1.3673 6.9456 sec/batch
Epoch 7/20  Iteration 1175/3560 Training loss: 1.3671 7.0105 sec/batch
Epoch 7/20  Iteration 1176/3560 Training loss: 1.3670 7.0258 sec/batch
Epoch 7/20  Iteration 1177/3560 Training loss: 1.3668 6.6618 sec/batch
Epoch 7/20  Iteration 1178/3560 Training loss: 1.3668 6.7709 sec/batch
Epoch 7/20  Iteration 1179/3560 Training loss: 1.3665 6.6033 sec/batch
Epoch 7/20  Iteration 1180/3560 Training loss: 1.3664 6.7322 sec/batch
Epoch 7/20  Iteration 1181/3560 Training loss: 1.3663 6.6056 sec/batch
Epoch 7/20  Iteration 1182/3560 Training loss: 1.3660 7.3900 sec/batch
Epoch 7/20  Iteration 1183/3560 Training loss: 1.3657 6.4652 sec/batch
Epoch 7/20  Iteration 1184/3560 Training loss: 1.3652 6.4643 sec/batch
Epoch 7/20  Iteration 1185/3560 Training loss: 1.3651 7.7391 sec/batch
Epoch 7/20  Iteration 1186/3560 Training loss: 1.3651 8.1659 sec/batch
Epoch 7/20  Iteration 1187/3560 Training loss: 1.3649 6.1941 sec/batch
Epoch 7/20  Iteration 1188/3560 Training loss: 1.3646 6.4152 sec/batch
Epoch 7/20  Iteration 1189/3560 Training loss: 1.3645 6.4249 sec/batch
Epoch 7/20  Iteration 1190/3560 Training loss: 1.3640 6.1899 sec/batch
Epoch 7/20  Iteration 1191/3560 Training loss: 1.3635 6.8158 sec/batch
Epoch 7/20  Iteration 1192/3560 Training loss: 1.3634 6.0456 sec/batch
Epoch 7/20  Iteration 1193/3560 Training loss: 1.3633 6.0404 sec/batch
Epoch 7/20  Iteration 1194/3560 Training loss: 1.3629 5.9976 sec/batch
Epoch 7/20  Iteration 1195/3560 Training loss: 1.3628 5.9810 sec/batch
Epoch 7/20  Iteration 1196/3560 Training loss: 1.3627 6.2258 sec/batch
Epoch 7/20  Iteration 1197/3560 Training loss: 1.3624 6.7309 sec/batch
Epoch 7/20  Iteration 1198/3560 Training loss: 1.3619 6.5939 sec/batch
Epoch 7/20  Iteration 1199/3560 Training loss: 1.3614 6.2113 sec/batch
Epoch 7/20  Iteration 1200/3560 Training loss: 1.3610 6.2669 sec/batch
Epoch 7/20  Iteration 1201/3560 Training loss: 1.3611 6.0045 sec/batch
Epoch 7/20  Iteration 1202/3560 Training loss: 1.3611 6.1797 sec/batch
Epoch 7/20  Iteration 1203/3560 Training loss: 1.3610 6.5998 sec/batch
Epoch 7/20  Iteration 1204/3560 Training loss: 1.3609 6.3038 sec/batch
Epoch 7/20  Iteration 1205/3560 Training loss: 1.3611 6.1738 sec/batch
Epoch 7/20  Iteration 1206/3560 Training loss: 1.3612 6.2723 sec/batch
Epoch 7/20  Iteration 1207/3560 Training loss: 1.3611 6.1797 sec/batch
Epoch 7/20  Iteration 1208/3560 Training loss: 1.3610 6.1526 sec/batch
Epoch 7/20  Iteration 1209/3560 Training loss: 1.3613 5.9785 sec/batch
Epoch 7/20  Iteration 1210/3560 Training loss: 1.3612 6.4011 sec/batch
Epoch 7/20  Iteration 1211/3560 Training loss: 1.3610 6.5818 sec/batch
Epoch 7/20  Iteration 1212/3560 Training loss: 1.3612 7.1441 sec/batch
Epoch 7/20  Iteration 1213/3560 Training loss: 1.3611 6.2785 sec/batch
Epoch 7/20  Iteration 1214/3560 Training loss: 1.3612 6.4448 sec/batch
Epoch 7/20  Iteration 1215/3560 Training loss: 1.3612 6.0262 sec/batch
Epoch 7/20  Iteration 1216/3560 Training loss: 1.3614 5.8981 sec/batch
Epoch 7/20  Iteration 1217/3560 Training loss: 1.3613 6.0043 sec/batch
Epoch 7/20  Iteration 1218/3560 Training loss: 1.3611 5.9653 sec/batch
Epoch 7/20  Iteration 1219/3560 Training loss: 1.3608 5.9195 sec/batch
Epoch 7/20  Iteration 1220/3560 Training loss: 1.3606 5.9016 sec/batch
Epoch 7/20  Iteration 1221/3560 Training loss: 1.3605 5.9281 sec/batch
Epoch 7/20  Iteration 1222/3560 Training loss: 1.3604 5.9526 sec/batch
Epoch 7/20  Iteration 1223/3560 Training loss: 1.3603 6.0082 sec/batch
Epoch 7/20  Iteration 1224/3560 Training loss: 1.3602 5.9518 sec/batch
Epoch 7/20  Iteration 1225/3560 Training loss: 1.3602 5.9243 sec/batch
Epoch 7/20  Iteration 1226/3560 Training loss: 1.3600 5.9590 sec/batch
Epoch 7/20  Iteration 1227/3560 Training loss: 1.3598 5.9781 sec/batch
Epoch 7/20  Iteration 1228/3560 Training loss: 1.3598 5.9018 sec/batch
Epoch 7/20  Iteration 1229/3560 Training loss: 1.3600 5.9543 sec/batch
Epoch 7/20  Iteration 1230/3560 Training loss: 1.3599 6.3503 sec/batch
Epoch 7/20  Iteration 1231/3560 Training loss: 1.3599 5.9914 sec/batch
Epoch 7/20  Iteration 1232/3560 Training loss: 1.3598 5.9196 sec/batch
Epoch 7/20  Iteration 1233/3560 Training loss: 1.3598 5.9354 sec/batch
Epoch 7/20  Iteration 1234/3560 Training loss: 1.3597 5.9079 sec/batch
Epoch 7/20  Iteration 1235/3560 Training loss: 1.3598 5.9850 sec/batch
Epoch 7/20  Iteration 1236/3560 Training loss: 1.3601 5.9493 sec/batch
Epoch 7/20  Iteration 1237/3560 Training loss: 1.3600 5.9723 sec/batch
Epoch 7/20  Iteration 1238/3560 Training loss: 1.3600 5.9467 sec/batch
Epoch 7/20  Iteration 1239/3560 Training loss: 1.3599 5.9281 sec/batch
Epoch 7/20  Iteration 1240/3560 Training loss: 1.3597 5.9027 sec/batch
Epoch 7/20  Iteration 1241/3560 Training loss: 1.3598 5.9650 sec/batch
Epoch 7/20  Iteration 1242/3560 Training loss: 1.3597 5.9563 sec/batch
Epoch 7/20  Iteration 1243/3560 Training loss: 1.3597 5.9438 sec/batch
Epoch 7/20  Iteration 1244/3560 Training loss: 1.3595 5.9376 sec/batch
Epoch 7/20  Iteration 1245/3560 Training loss: 1.3592 5.9343 sec/batch
Epoch 7/20  Iteration 1246/3560 Training loss: 1.3593 5.9270 sec/batch
Epoch 8/20  Iteration 1247/3560 Training loss: 1.4351 5.9932 sec/batch
Epoch 8/20  Iteration 1248/3560 Training loss: 1.3951 5.9873 sec/batch
Epoch 8/20  Iteration 1249/3560 Training loss: 1.3798 6.0620 sec/batch
Epoch 8/20  Iteration 1250/3560 Training loss: 1.3746 6.2858 sec/batch
Epoch 8/20  Iteration 1251/3560 Training loss: 1.3644 5.9326 sec/batch
Epoch 8/20  Iteration 1252/3560 Training loss: 1.3534 5.9622 sec/batch
Epoch 8/20  Iteration 1253/3560 Training loss: 1.3538 5.9383 sec/batch
Epoch 8/20  Iteration 1254/3560 Training loss: 1.3521 5.9454 sec/batch
Epoch 8/20  Iteration 1255/3560 Training loss: 1.3506 5.9385 sec/batch
Epoch 8/20  Iteration 1256/3560 Training loss: 1.3499 5.9938 sec/batch
Epoch 8/20  Iteration 1257/3560 Training loss: 1.3466 5.9329 sec/batch
Epoch 8/20  Iteration 1258/3560 Training loss: 1.3450 5.9703 sec/batch
Epoch 8/20  Iteration 1259/3560 Training loss: 1.3451 5.9633 sec/batch
Epoch 8/20  Iteration 1260/3560 Training loss: 1.3462 6.0420 sec/batch
Epoch 8/20  Iteration 1261/3560 Training loss: 1.3449 5.9229 sec/batch
Epoch 8/20  Iteration 1262/3560 Training loss: 1.3426 6.0298 sec/batch
Epoch 8/20  Iteration 1263/3560 Training loss: 1.3426 6.1511 sec/batch
Epoch 8/20  Iteration 1264/3560 Training loss: 1.3437 5.9809 sec/batch
Epoch 8/20  Iteration 1265/3560 Training loss: 1.3433 5.9139 sec/batch
Epoch 8/20  Iteration 1266/3560 Training loss: 1.3445 5.9191 sec/batch
Epoch 8/20  Iteration 1267/3560 Training loss: 1.3441 5.9418 sec/batch
Epoch 8/20  Iteration 1268/3560 Training loss: 1.3442 5.9715 sec/batch
Epoch 8/20  Iteration 1269/3560 Training loss: 1.3433 6.0198 sec/batch
Epoch 8/20  Iteration 1270/3560 Training loss: 1.3434 6.2483 sec/batch
Epoch 8/20  Iteration 1271/3560 Training loss: 1.3431 5.9512 sec/batch
Epoch 8/20  Iteration 1272/3560 Training loss: 1.3413 5.8970 sec/batch
Epoch 8/20  Iteration 1273/3560 Training loss: 1.3402 5.9514 sec/batch
Epoch 8/20  Iteration 1274/3560 Training loss: 1.3408 5.9583 sec/batch
Epoch 8/20  Iteration 1275/3560 Training loss: 1.3408 5.9215 sec/batch
Epoch 8/20  Iteration 1276/3560 Training loss: 1.3412 5.9289 sec/batch
Epoch 8/20  Iteration 1277/3560 Training loss: 1.3404 5.9326 sec/batch
Epoch 8/20  Iteration 1278/3560 Training loss: 1.3398 5.9306 sec/batch
Epoch 8/20  Iteration 1279/3560 Training loss: 1.3401 5.9166 sec/batch
Epoch 8/20  Iteration 1280/3560 Training loss: 1.3399 5.9146 sec/batch
Epoch 8/20  Iteration 1281/3560 Training loss: 1.3396 5.9250 sec/batch
Epoch 8/20  Iteration 1282/3560 Training loss: 1.3394 5.8894 sec/batch
Epoch 8/20  Iteration 1283/3560 Training loss: 1.3387 5.9223 sec/batch
Epoch 8/20  Iteration 1284/3560 Training loss: 1.3375 5.9594 sec/batch
Epoch 8/20  Iteration 1285/3560 Training loss: 1.3365 5.9582 sec/batch
Epoch 8/20  Iteration 1286/3560 Training loss: 1.3360 5.9203 sec/batch
Epoch 8/20  Iteration 1287/3560 Training loss: 1.3354 5.9501 sec/batch
Epoch 8/20  Iteration 1288/3560 Training loss: 1.3359 5.9744 sec/batch
Epoch 8/20  Iteration 1289/3560 Training loss: 1.3355 5.9430 sec/batch
Epoch 8/20  Iteration 1290/3560 Training loss: 1.3350 6.2963 sec/batch
Epoch 8/20  Iteration 1291/3560 Training loss: 1.3352 5.9586 sec/batch
Epoch 8/20  Iteration 1292/3560 Training loss: 1.3344 5.9436 sec/batch
Epoch 8/20  Iteration 1293/3560 Training loss: 1.3342 5.9694 sec/batch
Epoch 8/20  Iteration 1294/3560 Training loss: 1.3336 5.9382 sec/batch
Epoch 8/20  Iteration 1295/3560 Training loss: 1.3334 5.9528 sec/batch
Epoch 8/20  Iteration 1296/3560 Training loss: 1.3335 5.9161 sec/batch
Epoch 8/20  Iteration 1297/3560 Training loss: 1.3332 5.9790 sec/batch
Epoch 8/20  Iteration 1298/3560 Training loss: 1.3339 5.9462 sec/batch
Epoch 8/20  Iteration 1299/3560 Training loss: 1.3338 5.9548 sec/batch
Epoch 8/20  Iteration 1300/3560 Training loss: 1.3341 6.1803 sec/batch
Epoch 8/20  Iteration 1301/3560 Training loss: 1.3338 5.9625 sec/batch
Epoch 8/20  Iteration 1302/3560 Training loss: 1.3338 5.9530 sec/batch
Epoch 8/20  Iteration 1303/3560 Training loss: 1.3341 5.9327 sec/batch
Epoch 8/20  Iteration 1304/3560 Training loss: 1.3337 5.9401 sec/batch
Epoch 8/20  Iteration 1305/3560 Training loss: 1.3332 6.0121 sec/batch
Epoch 8/20  Iteration 1306/3560 Training loss: 1.3336 5.9386 sec/batch
Epoch 8/20  Iteration 1307/3560 Training loss: 1.3336 5.9274 sec/batch
Epoch 8/20  Iteration 1308/3560 Training loss: 1.3345 5.9827 sec/batch
Epoch 8/20  Iteration 1309/3560 Training loss: 1.3347 5.9585 sec/batch
Epoch 8/20  Iteration 1310/3560 Training loss: 1.3349 6.3162 sec/batch
Epoch 8/20  Iteration 1311/3560 Training loss: 1.3348 5.9683 sec/batch
Epoch 8/20  Iteration 1312/3560 Training loss: 1.3348 5.9875 sec/batch
Epoch 8/20  Iteration 1313/3560 Training loss: 1.3349 6.0322 sec/batch
Epoch 8/20  Iteration 1314/3560 Training loss: 1.3346 6.3535 sec/batch
Epoch 8/20  Iteration 1315/3560 Training loss: 1.3346 5.9759 sec/batch
Epoch 8/20  Iteration 1316/3560 Training loss: 1.3345 5.9413 sec/batch
Epoch 8/20  Iteration 1317/3560 Training loss: 1.3349 6.0403 sec/batch
Epoch 8/20  Iteration 1318/3560 Training loss: 1.3350 5.9276 sec/batch
Epoch 8/20  Iteration 1319/3560 Training loss: 1.3354 5.9620 sec/batch
Epoch 8/20  Iteration 1320/3560 Training loss: 1.3350 5.9553 sec/batch
Epoch 8/20  Iteration 1321/3560 Training loss: 1.3349 5.9184 sec/batch
Epoch 8/20  Iteration 1322/3560 Training loss: 1.3350 5.9197 sec/batch
Epoch 8/20  Iteration 1323/3560 Training loss: 1.3347 5.9302 sec/batch
Epoch 8/20  Iteration 1324/3560 Training loss: 1.3346 5.9280 sec/batch
Epoch 8/20  Iteration 1325/3560 Training loss: 1.3340 5.9441 sec/batch
Epoch 8/20  Iteration 1326/3560 Training loss: 1.3339 5.9267 sec/batch
Epoch 8/20  Iteration 1327/3560 Training loss: 1.3335 5.9516 sec/batch
Epoch 8/20  Iteration 1328/3560 Training loss: 1.3332 5.9678 sec/batch
Epoch 8/20  Iteration 1329/3560 Training loss: 1.3328 5.9511 sec/batch
Epoch 8/20  Iteration 1330/3560 Training loss: 1.3329 6.2260 sec/batch
Epoch 8/20  Iteration 1331/3560 Training loss: 1.3326 5.9460 sec/batch
Epoch 8/20  Iteration 1332/3560 Training loss: 1.3324 5.8929 sec/batch
Epoch 8/20  Iteration 1333/3560 Training loss: 1.3321 5.9817 sec/batch
Epoch 8/20  Iteration 1334/3560 Training loss: 1.3319 5.9365 sec/batch
Epoch 8/20  Iteration 1335/3560 Training loss: 1.3316 5.9546 sec/batch
Epoch 8/20  Iteration 1336/3560 Training loss: 1.3316 5.9348 sec/batch
Epoch 8/20  Iteration 1337/3560 Training loss: 1.3313 6.0146 sec/batch
Epoch 8/20  Iteration 1338/3560 Training loss: 1.3312 5.9464 sec/batch
Epoch 8/20  Iteration 1339/3560 Training loss: 1.3307 5.9236 sec/batch
Epoch 8/20  Iteration 1340/3560 Training loss: 1.3303 5.9469 sec/batch
Epoch 8/20  Iteration 1341/3560 Training loss: 1.3299 5.9202 sec/batch
Epoch 8/20  Iteration 1342/3560 Training loss: 1.3299 5.8997 sec/batch
Epoch 8/20  Iteration 1343/3560 Training loss: 1.3298 5.9959 sec/batch
Epoch 8/20  Iteration 1344/3560 Training loss: 1.3293 5.9452 sec/batch
Epoch 8/20  Iteration 1345/3560 Training loss: 1.3289 5.9499 sec/batch
Epoch 8/20  Iteration 1346/3560 Training loss: 1.3284 5.9598 sec/batch
Epoch 8/20  Iteration 1347/3560 Training loss: 1.3283 5.9481 sec/batch
Epoch 8/20  Iteration 1348/3560 Training loss: 1.3281 5.9610 sec/batch
Epoch 8/20  Iteration 1349/3560 Training loss: 1.3280 5.9619 sec/batch
Epoch 8/20  Iteration 1350/3560 Training loss: 1.3278 6.3172 sec/batch
Epoch 8/20  Iteration 1351/3560 Training loss: 1.3275 5.9311 sec/batch
Epoch 8/20  Iteration 1352/3560 Training loss: 1.3274 5.9896 sec/batch
Epoch 8/20  Iteration 1353/3560 Training loss: 1.3272 5.9846 sec/batch
Epoch 8/20  Iteration 1354/3560 Training loss: 1.3272 5.8848 sec/batch
Epoch 8/20  Iteration 1355/3560 Training loss: 1.3270 5.9313 sec/batch
Epoch 8/20  Iteration 1356/3560 Training loss: 1.3270 5.9655 sec/batch
Epoch 8/20  Iteration 1357/3560 Training loss: 1.3268 6.0140 sec/batch
Epoch 8/20  Iteration 1358/3560 Training loss: 1.3267 5.9843 sec/batch
Epoch 8/20  Iteration 1359/3560 Training loss: 1.3265 5.9403 sec/batch
Epoch 8/20  Iteration 1360/3560 Training loss: 1.3262 5.9409 sec/batch
Epoch 8/20  Iteration 1361/3560 Training loss: 1.3259 5.9484 sec/batch
Epoch 8/20  Iteration 1362/3560 Training loss: 1.3256 5.9658 sec/batch
Epoch 8/20  Iteration 1363/3560 Training loss: 1.3256 5.9507 sec/batch
Epoch 8/20  Iteration 1364/3560 Training loss: 1.3255 6.2791 sec/batch
Epoch 8/20  Iteration 1365/3560 Training loss: 1.3253 5.9946 sec/batch
Epoch 8/20  Iteration 1366/3560 Training loss: 1.3252 5.9734 sec/batch
Epoch 8/20  Iteration 1367/3560 Training loss: 1.3251 5.9307 sec/batch
Epoch 8/20  Iteration 1368/3560 Training loss: 1.3246 5.9513 sec/batch
Epoch 8/20  Iteration 1369/3560 Training loss: 1.3242 5.9497 sec/batch
Epoch 8/20  Iteration 1370/3560 Training loss: 1.3241 6.2156 sec/batch
Epoch 8/20  Iteration 1371/3560 Training loss: 1.3240 5.9759 sec/batch
Epoch 8/20  Iteration 1372/3560 Training loss: 1.3235 6.0040 sec/batch
Epoch 8/20  Iteration 1373/3560 Training loss: 1.3235 5.9056 sec/batch
Epoch 8/20  Iteration 1374/3560 Training loss: 1.3234 5.9700 sec/batch
Epoch 8/20  Iteration 1375/3560 Training loss: 1.3232 5.9483 sec/batch
Epoch 8/20  Iteration 1376/3560 Training loss: 1.3228 5.9657 sec/batch
Epoch 8/20  Iteration 1377/3560 Training loss: 1.3223 5.9553 sec/batch
Epoch 8/20  Iteration 1378/3560 Training loss: 1.3220 5.9415 sec/batch
Epoch 8/20  Iteration 1379/3560 Training loss: 1.3220 5.9574 sec/batch
Epoch 8/20  Iteration 1380/3560 Training loss: 1.3220 5.9745 sec/batch
Epoch 8/20  Iteration 1381/3560 Training loss: 1.3219 6.0239 sec/batch
Epoch 8/20  Iteration 1382/3560 Training loss: 1.3220 5.9566 sec/batch
Epoch 8/20  Iteration 1383/3560 Training loss: 1.3221 5.9377 sec/batch
Epoch 8/20  Iteration 1384/3560 Training loss: 1.3222 5.9143 sec/batch
Epoch 8/20  Iteration 1385/3560 Training loss: 1.3222 5.9841 sec/batch
Epoch 8/20  Iteration 1386/3560 Training loss: 1.3221 5.9248 sec/batch
Epoch 8/20  Iteration 1387/3560 Training loss: 1.3223 6.0172 sec/batch
Epoch 8/20  Iteration 1388/3560 Training loss: 1.3223 5.9911 sec/batch
Epoch 8/20  Iteration 1389/3560 Training loss: 1.3221 6.0155 sec/batch
Epoch 8/20  Iteration 1390/3560 Training loss: 1.3223 6.2957 sec/batch
Epoch 8/20  Iteration 1391/3560 Training loss: 1.3221 6.0181 sec/batch
Epoch 8/20  Iteration 1392/3560 Training loss: 1.3222 5.9587 sec/batch
Epoch 8/20  Iteration 1393/3560 Training loss: 1.3222 5.9529 sec/batch
Epoch 8/20  Iteration 1394/3560 Training loss: 1.3223 5.9275 sec/batch
Epoch 8/20  Iteration 1395/3560 Training loss: 1.3225 5.9798 sec/batch
Epoch 8/20  Iteration 1396/3560 Training loss: 1.3223 5.9766 sec/batch
Epoch 8/20  Iteration 1397/3560 Training loss: 1.3219 5.9134 sec/batch
Epoch 8/20  Iteration 1398/3560 Training loss: 1.3217 6.0016 sec/batch
Epoch 8/20  Iteration 1399/3560 Training loss: 1.3217 5.9483 sec/batch
Epoch 8/20  Iteration 1400/3560 Training loss: 1.3216 5.9479 sec/batch
Epoch 8/20  Iteration 1401/3560 Training loss: 1.3215 5.9777 sec/batch
Epoch 8/20  Iteration 1402/3560 Training loss: 1.3215 5.9812 sec/batch
Epoch 8/20  Iteration 1403/3560 Training loss: 1.3215 5.9925 sec/batch
Epoch 8/20  Iteration 1404/3560 Training loss: 1.3214 5.9693 sec/batch
Epoch 8/20  Iteration 1405/3560 Training loss: 1.3211 5.9665 sec/batch
Epoch 8/20  Iteration 1406/3560 Training loss: 1.3212 5.9474 sec/batch
Epoch 8/20  Iteration 1407/3560 Training loss: 1.3213 5.9915 sec/batch
Epoch 8/20  Iteration 1408/3560 Training loss: 1.3213 5.9986 sec/batch
Epoch 8/20  Iteration 1409/3560 Training loss: 1.3213 5.9949 sec/batch
Epoch 8/20  Iteration 1410/3560 Training loss: 1.3212 6.2667 sec/batch
Epoch 8/20  Iteration 1411/3560 Training loss: 1.3211 6.0591 sec/batch
Epoch 8/20  Iteration 1412/3560 Training loss: 1.3211 5.9674 sec/batch
Epoch 8/20  Iteration 1413/3560 Training loss: 1.3212 5.9742 sec/batch
Epoch 8/20  Iteration 1414/3560 Training loss: 1.3216 6.3893 sec/batch
Epoch 8/20  Iteration 1415/3560 Training loss: 1.3216 5.9240 sec/batch
Epoch 8/20  Iteration 1416/3560 Training loss: 1.3216 6.0736 sec/batch
Epoch 8/20  Iteration 1417/3560 Training loss: 1.3215 5.9495 sec/batch
Epoch 8/20  Iteration 1418/3560 Training loss: 1.3213 5.9611 sec/batch
Epoch 8/20  Iteration 1419/3560 Training loss: 1.3214 5.9373 sec/batch
Epoch 8/20  Iteration 1420/3560 Training loss: 1.3214 5.9402 sec/batch
Epoch 8/20  Iteration 1421/3560 Training loss: 1.3214 5.8932 sec/batch
Epoch 8/20  Iteration 1422/3560 Training loss: 1.3212 5.8993 sec/batch
Epoch 8/20  Iteration 1423/3560 Training loss: 1.3210 5.9217 sec/batch
Epoch 8/20  Iteration 1424/3560 Training loss: 1.3211 5.9171 sec/batch
Epoch 9/20  Iteration 1425/3560 Training loss: 1.3989 5.9126 sec/batch
Epoch 9/20  Iteration 1426/3560 Training loss: 1.3650 5.9271 sec/batch
Epoch 9/20  Iteration 1427/3560 Training loss: 1.3465 5.9262 sec/batch
Epoch 9/20  Iteration 1428/3560 Training loss: 1.3398 5.9630 sec/batch
Epoch 9/20  Iteration 1429/3560 Training loss: 1.3283 5.9578 sec/batch
Epoch 9/20  Iteration 1430/3560 Training loss: 1.3168 5.9592 sec/batch
Epoch 9/20  Iteration 1431/3560 Training loss: 1.3172 6.2205 sec/batch
Epoch 9/20  Iteration 1432/3560 Training loss: 1.3163 5.9191 sec/batch
Epoch 9/20  Iteration 1433/3560 Training loss: 1.3158 5.9781 sec/batch
Epoch 9/20  Iteration 1434/3560 Training loss: 1.3151 5.8870 sec/batch
Epoch 9/20  Iteration 1435/3560 Training loss: 1.3121 5.9423 sec/batch
Epoch 9/20  Iteration 1436/3560 Training loss: 1.3111 5.9582 sec/batch
Epoch 9/20  Iteration 1437/3560 Training loss: 1.3108 5.9246 sec/batch
Epoch 9/20  Iteration 1438/3560 Training loss: 1.3114 6.0008 sec/batch
Epoch 9/20  Iteration 1439/3560 Training loss: 1.3101 5.9655 sec/batch
Epoch 9/20  Iteration 1440/3560 Training loss: 1.3081 5.9459 sec/batch
Epoch 9/20  Iteration 1441/3560 Training loss: 1.3076 5.9756 sec/batch
Epoch 9/20  Iteration 1442/3560 Training loss: 1.3090 5.9270 sec/batch
Epoch 9/20  Iteration 1443/3560 Training loss: 1.3092 5.9082 sec/batch
Epoch 9/20  Iteration 1444/3560 Training loss: 1.3104 5.9082 sec/batch
Epoch 9/20  Iteration 1445/3560 Training loss: 1.3095 5.9494 sec/batch
Epoch 9/20  Iteration 1446/3560 Training loss: 1.3102 5.8950 sec/batch
Epoch 9/20  Iteration 1447/3560 Training loss: 1.3097 5.8796 sec/batch
Epoch 9/20  Iteration 1448/3560 Training loss: 1.3095 5.9381 sec/batch
Epoch 9/20  Iteration 1449/3560 Training loss: 1.3091 5.9701 sec/batch
Epoch 9/20  Iteration 1450/3560 Training loss: 1.3071 5.9934 sec/batch
Epoch 9/20  Iteration 1451/3560 Training loss: 1.3057 6.2281 sec/batch
Epoch 9/20  Iteration 1452/3560 Training loss: 1.3061 5.9981 sec/batch
Epoch 9/20  Iteration 1453/3560 Training loss: 1.3063 5.9360 sec/batch
Epoch 9/20  Iteration 1454/3560 Training loss: 1.3062 5.9892 sec/batch
Epoch 9/20  Iteration 1455/3560 Training loss: 1.3056 5.9518 sec/batch
Epoch 9/20  Iteration 1456/3560 Training loss: 1.3046 5.9913 sec/batch
Epoch 9/20  Iteration 1457/3560 Training loss: 1.3045 5.9381 sec/batch
Epoch 9/20  Iteration 1458/3560 Training loss: 1.3042 6.0111 sec/batch
Epoch 9/20  Iteration 1459/3560 Training loss: 1.3038 5.9419 sec/batch
Epoch 9/20  Iteration 1460/3560 Training loss: 1.3034 5.9606 sec/batch
Epoch 9/20  Iteration 1461/3560 Training loss: 1.3025 5.9687 sec/batch
Epoch 9/20  Iteration 1462/3560 Training loss: 1.3017 5.9643 sec/batch
Epoch 9/20  Iteration 1463/3560 Training loss: 1.3004 6.0137 sec/batch
Epoch 9/20  Iteration 1464/3560 Training loss: 1.3001 6.3310 sec/batch
Epoch 9/20  Iteration 1465/3560 Training loss: 1.2995 5.9933 sec/batch
Epoch 9/20  Iteration 1466/3560 Training loss: 1.3001 6.4000 sec/batch
Epoch 9/20  Iteration 1467/3560 Training loss: 1.2997 6.3743 sec/batch
Epoch 9/20  Iteration 1468/3560 Training loss: 1.2990 5.9361 sec/batch
Epoch 9/20  Iteration 1469/3560 Training loss: 1.2989 5.9388 sec/batch
Epoch 9/20  Iteration 1470/3560 Training loss: 1.2980 5.9107 sec/batch
Epoch 9/20  Iteration 1471/3560 Training loss: 1.2976 6.1449 sec/batch
Epoch 9/20  Iteration 1472/3560 Training loss: 1.2969 5.9467 sec/batch
Epoch 9/20  Iteration 1473/3560 Training loss: 1.2966 5.9427 sec/batch
Epoch 9/20  Iteration 1474/3560 Training loss: 1.2967 5.9297 sec/batch
Epoch 9/20  Iteration 1475/3560 Training loss: 1.2962 5.9150 sec/batch
Epoch 9/20  Iteration 1476/3560 Training loss: 1.2967 5.9303 sec/batch
Epoch 9/20  Iteration 1477/3560 Training loss: 1.2966 5.9096 sec/batch
Epoch 9/20  Iteration 1478/3560 Training loss: 1.2968 5.9467 sec/batch
Epoch 9/20  Iteration 1479/3560 Training loss: 1.2965 5.8762 sec/batch
Epoch 9/20  Iteration 1480/3560 Training loss: 1.2965 5.9289 sec/batch
Epoch 9/20  Iteration 1481/3560 Training loss: 1.2966 5.9226 sec/batch
Epoch 9/20  Iteration 1482/3560 Training loss: 1.2962 5.9369 sec/batch
Epoch 9/20  Iteration 1483/3560 Training loss: 1.2956 5.9720 sec/batch
Epoch 9/20  Iteration 1484/3560 Training loss: 1.2961 5.9628 sec/batch
Epoch 9/20  Iteration 1485/3560 Training loss: 1.2961 5.8745 sec/batch
Epoch 9/20  Iteration 1486/3560 Training loss: 1.2968 5.9504 sec/batch
Epoch 9/20  Iteration 1487/3560 Training loss: 1.2971 5.9309 sec/batch
Epoch 9/20  Iteration 1488/3560 Training loss: 1.2970 5.9665 sec/batch
Epoch 9/20  Iteration 1489/3560 Training loss: 1.2969 5.9470 sec/batch
Epoch 9/20  Iteration 1490/3560 Training loss: 1.2967 5.9200 sec/batch
Epoch 9/20  Iteration 1491/3560 Training loss: 1.2971 6.1925 sec/batch
Epoch 9/20  Iteration 1492/3560 Training loss: 1.2967 5.9153 sec/batch
Epoch 9/20  Iteration 1493/3560 Training loss: 1.2967 5.9361 sec/batch
Epoch 9/20  Iteration 1494/3560 Training loss: 1.2964 5.9498 sec/batch
Epoch 9/20  Iteration 1495/3560 Training loss: 1.2969 5.9248 sec/batch
Epoch 9/20  Iteration 1496/3560 Training loss: 1.2970 5.9338 sec/batch
Epoch 9/20  Iteration 1497/3560 Training loss: 1.2975 5.9140 sec/batch
Epoch 9/20  Iteration 1498/3560 Training loss: 1.2971 5.9461 sec/batch
Epoch 9/20  Iteration 1499/3560 Training loss: 1.2967 5.9741 sec/batch
Epoch 9/20  Iteration 1500/3560 Training loss: 1.2967 5.9179 sec/batch
Epoch 9/20  Iteration 1501/3560 Training loss: 1.2966 5.9532 sec/batch
Epoch 9/20  Iteration 1502/3560 Training loss: 1.2964 5.9034 sec/batch
Epoch 9/20  Iteration 1503/3560 Training loss: 1.2958 5.9737 sec/batch
Epoch 9/20  Iteration 1504/3560 Training loss: 1.2957 5.9018 sec/batch
Epoch 9/20  Iteration 1505/3560 Training loss: 1.2952 5.8872 sec/batch
Epoch 9/20  Iteration 1506/3560 Training loss: 1.2950 5.8930 sec/batch
Epoch 9/20  Iteration 1507/3560 Training loss: 1.2944 5.9281 sec/batch
Epoch 9/20  Iteration 1508/3560 Training loss: 1.2945 5.9454 sec/batch
Epoch 9/20  Iteration 1509/3560 Training loss: 1.2942 6.2681 sec/batch
Epoch 9/20  Iteration 1510/3560 Training loss: 1.2940 7.1712 sec/batch
Epoch 9/20  Iteration 1511/3560 Training loss: 1.2937 6.1866 sec/batch
Epoch 9/20  Iteration 1512/3560 Training loss: 1.2935 5.9603 sec/batch
Epoch 9/20  Iteration 1513/3560 Training loss: 1.2931 5.9792 sec/batch
Epoch 9/20  Iteration 1514/3560 Training loss: 1.2932 6.4349 sec/batch
Epoch 9/20  Iteration 1515/3560 Training loss: 1.2929 5.9388 sec/batch
Epoch 9/20  Iteration 1516/3560 Training loss: 1.2927 5.9263 sec/batch
Epoch 9/20  Iteration 1517/3560 Training loss: 1.2924 5.9997 sec/batch
Epoch 9/20  Iteration 1518/3560 Training loss: 1.2920 5.9081 sec/batch
Epoch 9/20  Iteration 1519/3560 Training loss: 1.2918 5.9224 sec/batch
Epoch 9/20  Iteration 1520/3560 Training loss: 1.2918 5.9606 sec/batch
Epoch 9/20  Iteration 1521/3560 Training loss: 1.2917 5.9179 sec/batch
Epoch 9/20  Iteration 1522/3560 Training loss: 1.2913 5.9432 sec/batch
Epoch 9/20  Iteration 1523/3560 Training loss: 1.2909 5.9559 sec/batch
Epoch 9/20  Iteration 1524/3560 Training loss: 1.2905 5.9224 sec/batch
Epoch 9/20  Iteration 1525/3560 Training loss: 1.2904 5.9594 sec/batch
Epoch 9/20  Iteration 1526/3560 Training loss: 1.2903 5.9767 sec/batch
Epoch 9/20  Iteration 1527/3560 Training loss: 1.2902 5.9122 sec/batch
Epoch 9/20  Iteration 1528/3560 Training loss: 1.2899 5.9317 sec/batch
Epoch 9/20  Iteration 1529/3560 Training loss: 1.2896 5.9354 sec/batch
Epoch 9/20  Iteration 1530/3560 Training loss: 1.2895 6.1819 sec/batch
Epoch 9/20  Iteration 1531/3560 Training loss: 1.2894 6.0566 sec/batch
Epoch 9/20  Iteration 1532/3560 Training loss: 1.2895 5.9615 sec/batch
Epoch 9/20  Iteration 1533/3560 Training loss: 1.2893 5.9442 sec/batch
Epoch 9/20  Iteration 1534/3560 Training loss: 1.2894 5.9480 sec/batch
Epoch 9/20  Iteration 1535/3560 Training loss: 1.2892 5.9382 sec/batch
Epoch 9/20  Iteration 1536/3560 Training loss: 1.2891 6.0573 sec/batch
Epoch 9/20  Iteration 1537/3560 Training loss: 1.2889 5.9286 sec/batch
Epoch 9/20  Iteration 1538/3560 Training loss: 1.2887 6.0302 sec/batch
Epoch 9/20  Iteration 1539/3560 Training loss: 1.2884 5.9844 sec/batch
Epoch 9/20  Iteration 1540/3560 Training loss: 1.2880 5.9521 sec/batch
Epoch 9/20  Iteration 1541/3560 Training loss: 1.2880 5.9150 sec/batch
Epoch 9/20  Iteration 1542/3560 Training loss: 1.2879 6.0076 sec/batch
Epoch 9/20  Iteration 1543/3560 Training loss: 1.2876 5.9863 sec/batch
Epoch 9/20  Iteration 1544/3560 Training loss: 1.2875 6.0494 sec/batch
Epoch 9/20  Iteration 1545/3560 Training loss: 1.2874 5.9602 sec/batch
Epoch 9/20  Iteration 1546/3560 Training loss: 1.2869 5.9390 sec/batch
Epoch 9/20  Iteration 1547/3560 Training loss: 1.2865 5.9498 sec/batch
Epoch 9/20  Iteration 1548/3560 Training loss: 1.2863 6.0020 sec/batch
Epoch 9/20  Iteration 1549/3560 Training loss: 1.2862 5.9656 sec/batch
Epoch 9/20  Iteration 1550/3560 Training loss: 1.2858 6.2381 sec/batch
Epoch 9/20  Iteration 1551/3560 Training loss: 1.2857 6.0602 sec/batch
Epoch 9/20  Iteration 1552/3560 Training loss: 1.2858 5.9594 sec/batch
Epoch 9/20  Iteration 1553/3560 Training loss: 1.2855 5.9637 sec/batch
Epoch 9/20  Iteration 1554/3560 Training loss: 1.2852 5.9444 sec/batch
Epoch 9/20  Iteration 1555/3560 Training loss: 1.2847 5.9195 sec/batch
Epoch 9/20  Iteration 1556/3560 Training loss: 1.2844 5.9261 sec/batch
Epoch 9/20  Iteration 1557/3560 Training loss: 1.2845 5.9000 sec/batch
Epoch 9/20  Iteration 1558/3560 Training loss: 1.2845 5.9491 sec/batch
Epoch 9/20  Iteration 1559/3560 Training loss: 1.2844 5.9535 sec/batch
Epoch 9/20  Iteration 1560/3560 Training loss: 1.2844 5.9774 sec/batch
Epoch 9/20  Iteration 1561/3560 Training loss: 1.2846 5.9432 sec/batch
Epoch 9/20  Iteration 1562/3560 Training loss: 1.2847 5.9471 sec/batch
Epoch 9/20  Iteration 1563/3560 Training loss: 1.2846 5.9950 sec/batch
Epoch 9/20  Iteration 1564/3560 Training loss: 1.2846 6.1558 sec/batch
Epoch 9/20  Iteration 1565/3560 Training loss: 1.2849 6.0364 sec/batch
Epoch 9/20  Iteration 1566/3560 Training loss: 1.2849 5.9974 sec/batch
Epoch 9/20  Iteration 1567/3560 Training loss: 1.2847 5.9749 sec/batch
Epoch 9/20  Iteration 1568/3560 Training loss: 1.2849 5.9396 sec/batch
Epoch 9/20  Iteration 1569/3560 Training loss: 1.2848 5.9520 sec/batch
Epoch 9/20  Iteration 1570/3560 Training loss: 1.2849 6.2247 sec/batch
Epoch 9/20  Iteration 1571/3560 Training loss: 1.2849 6.0335 sec/batch
Epoch 9/20  Iteration 1572/3560 Training loss: 1.2851 5.9243 sec/batch
Epoch 9/20  Iteration 1573/3560 Training loss: 1.2852 5.9248 sec/batch
Epoch 9/20  Iteration 1574/3560 Training loss: 1.2852 5.9361 sec/batch
Epoch 9/20  Iteration 1575/3560 Training loss: 1.2848 5.9286 sec/batch
Epoch 9/20  Iteration 1576/3560 Training loss: 1.2847 5.9259 sec/batch
Epoch 9/20  Iteration 1577/3560 Training loss: 1.2846 5.9385 sec/batch
Epoch 9/20  Iteration 1578/3560 Training loss: 1.2845 5.9295 sec/batch
Epoch 9/20  Iteration 1579/3560 Training loss: 1.2845 5.9059 sec/batch
Epoch 9/20  Iteration 1580/3560 Training loss: 1.2844 5.9564 sec/batch
Epoch 9/20  Iteration 1581/3560 Training loss: 1.2843 5.9332 sec/batch
Epoch 9/20  Iteration 1582/3560 Training loss: 1.2842 5.8997 sec/batch
Epoch 9/20  Iteration 1583/3560 Training loss: 1.2840 5.9593 sec/batch
Epoch 9/20  Iteration 1584/3560 Training loss: 1.2840 5.9527 sec/batch
Epoch 9/20  Iteration 1585/3560 Training loss: 1.2842 5.8832 sec/batch
Epoch 9/20  Iteration 1586/3560 Training loss: 1.2842 5.9487 sec/batch
Epoch 9/20  Iteration 1587/3560 Training loss: 1.2841 5.9587 sec/batch
Epoch 9/20  Iteration 1588/3560 Training loss: 1.2840 5.9480 sec/batch
Epoch 9/20  Iteration 1589/3560 Training loss: 1.2840 5.9511 sec/batch
Epoch 9/20  Iteration 1590/3560 Training loss: 1.2839 6.1064 sec/batch
Epoch 9/20  Iteration 1591/3560 Training loss: 1.2841 6.1727 sec/batch
Epoch 9/20  Iteration 1592/3560 Training loss: 1.2844 5.9845 sec/batch
Epoch 9/20  Iteration 1593/3560 Training loss: 1.2845 5.9501 sec/batch
Epoch 9/20  Iteration 1594/3560 Training loss: 1.2845 5.9092 sec/batch
Epoch 9/20  Iteration 1595/3560 Training loss: 1.2844 5.9394 sec/batch
Epoch 9/20  Iteration 1596/3560 Training loss: 1.2843 5.9139 sec/batch
Epoch 9/20  Iteration 1597/3560 Training loss: 1.2843 5.9464 sec/batch
Epoch 9/20  Iteration 1598/3560 Training loss: 1.2843 5.9488 sec/batch
Epoch 9/20  Iteration 1599/3560 Training loss: 1.2843 5.9034 sec/batch
Epoch 9/20  Iteration 1600/3560 Training loss: 1.2841 5.9351 sec/batch
Epoch 9/20  Iteration 1601/3560 Training loss: 1.2839 5.8707 sec/batch
Epoch 9/20  Iteration 1602/3560 Training loss: 1.2841 5.9455 sec/batch
Epoch 10/20  Iteration 1603/3560 Training loss: 1.3646 6.0316 sec/batch
Epoch 10/20  Iteration 1604/3560 Training loss: 1.3332 5.9342 sec/batch
Epoch 10/20  Iteration 1605/3560 Training loss: 1.3169 5.9361 sec/batch
Epoch 10/20  Iteration 1606/3560 Training loss: 1.3099 5.9477 sec/batch
Epoch 10/20  Iteration 1607/3560 Training loss: 1.2992 5.9386 sec/batch
Epoch 10/20  Iteration 1608/3560 Training loss: 1.2872 5.9408 sec/batch
Epoch 10/20  Iteration 1609/3560 Training loss: 1.2856 5.9601 sec/batch
Epoch 10/20  Iteration 1610/3560 Training loss: 1.2825 5.9410 sec/batch
Epoch 10/20  Iteration 1611/3560 Training loss: 1.2818 6.3377 sec/batch
Epoch 10/20  Iteration 1612/3560 Training loss: 1.2810 6.5056 sec/batch
Epoch 10/20  Iteration 1613/3560 Training loss: 1.2782 5.9864 sec/batch
Epoch 10/20  Iteration 1614/3560 Training loss: 1.2777 6.3608 sec/batch
Epoch 10/20  Iteration 1615/3560 Training loss: 1.2776 6.0418 sec/batch
Epoch 10/20  Iteration 1616/3560 Training loss: 1.2772 6.0294 sec/batch
Epoch 10/20  Iteration 1617/3560 Training loss: 1.2758 5.9513 sec/batch
Epoch 10/20  Iteration 1618/3560 Training loss: 1.2737 6.0643 sec/batch
Epoch 10/20  Iteration 1619/3560 Training loss: 1.2739 6.2665 sec/batch
Epoch 10/20  Iteration 1620/3560 Training loss: 1.2753 5.9561 sec/batch
Epoch 10/20  Iteration 1621/3560 Training loss: 1.2746 5.9007 sec/batch
Epoch 10/20  Iteration 1622/3560 Training loss: 1.2750 6.0287 sec/batch
Epoch 10/20  Iteration 1623/3560 Training loss: 1.2744 6.0492 sec/batch
Epoch 10/20  Iteration 1624/3560 Training loss: 1.2747 6.1530 sec/batch
Epoch 10/20  Iteration 1625/3560 Training loss: 1.2737 6.0488 sec/batch
Epoch 10/20  Iteration 1626/3560 Training loss: 1.2739 6.0519 sec/batch
Epoch 10/20  Iteration 1627/3560 Training loss: 1.2733 6.0660 sec/batch
Epoch 10/20  Iteration 1628/3560 Training loss: 1.2716 6.5237 sec/batch
Epoch 10/20  Iteration 1629/3560 Training loss: 1.2704 6.2254 sec/batch
Epoch 10/20  Iteration 1630/3560 Training loss: 1.2708 6.3705 sec/batch
Epoch 10/20  Iteration 1631/3560 Training loss: 1.2708 6.1242 sec/batch
Epoch 10/20  Iteration 1632/3560 Training loss: 1.2710 6.1024 sec/batch
Epoch 10/20  Iteration 1633/3560 Training loss: 1.2705 6.0002 sec/batch
Epoch 10/20  Iteration 1634/3560 Training loss: 1.2693 6.0340 sec/batch
Epoch 10/20  Iteration 1635/3560 Training loss: 1.2694 6.2522 sec/batch
Epoch 10/20  Iteration 1636/3560 Training loss: 1.2695 6.2743 sec/batch
Epoch 10/20  Iteration 1637/3560 Training loss: 1.2691 6.2442 sec/batch
Epoch 10/20  Iteration 1638/3560 Training loss: 1.2689 6.0052 sec/batch
Epoch 10/20  Iteration 1639/3560 Training loss: 1.2681 6.0835 sec/batch
Epoch 10/20  Iteration 1640/3560 Training loss: 1.2671 6.0326 sec/batch
Epoch 10/20  Iteration 1641/3560 Training loss: 1.2660 6.0213 sec/batch
Epoch 10/20  Iteration 1642/3560 Training loss: 1.2657 6.2757 sec/batch
Epoch 10/20  Iteration 1643/3560 Training loss: 1.2652 6.0780 sec/batch
Epoch 10/20  Iteration 1644/3560 Training loss: 1.2661 6.2812 sec/batch
Epoch 10/20  Iteration 1645/3560 Training loss: 1.2659 6.0143 sec/batch
Epoch 10/20  Iteration 1646/3560 Training loss: 1.2651 6.0224 sec/batch
Epoch 10/20  Iteration 1647/3560 Training loss: 1.2652 6.0680 sec/batch
Epoch 10/20  Iteration 1648/3560 Training loss: 1.2642 5.9974 sec/batch
Epoch 10/20  Iteration 1649/3560 Training loss: 1.2638 6.1084 sec/batch
Epoch 10/20  Iteration 1650/3560 Training loss: 1.2635 6.4736 sec/batch
Epoch 10/20  Iteration 1651/3560 Training loss: 1.2634 6.3790 sec/batch
Epoch 10/20  Iteration 1652/3560 Training loss: 1.2635 6.0876 sec/batch
Epoch 10/20  Iteration 1653/3560 Training loss: 1.2628 6.0612 sec/batch
Epoch 10/20  Iteration 1654/3560 Training loss: 1.2633 6.0665 sec/batch
Epoch 10/20  Iteration 1655/3560 Training loss: 1.2634 6.0494 sec/batch
Epoch 10/20  Iteration 1656/3560 Training loss: 1.2636 6.1559 sec/batch
Epoch 10/20  Iteration 1657/3560 Training loss: 1.2634 6.2245 sec/batch
Epoch 10/20  Iteration 1658/3560 Training loss: 1.2635 6.3397 sec/batch
Epoch 10/20  Iteration 1659/3560 Training loss: 1.2637 6.0408 sec/batch
Epoch 10/20  Iteration 1660/3560 Training loss: 1.2633 6.0643 sec/batch
Epoch 10/20  Iteration 1661/3560 Training loss: 1.2628 5.9932 sec/batch
Epoch 10/20  Iteration 1662/3560 Training loss: 1.2633 6.0283 sec/batch
Epoch 10/20  Iteration 1663/3560 Training loss: 1.2633 6.4558 sec/batch
Epoch 10/20  Iteration 1664/3560 Training loss: 1.2641 6.4023 sec/batch
Epoch 10/20  Iteration 1665/3560 Training loss: 1.2643 6.4056 sec/batch
Epoch 10/20  Iteration 1666/3560 Training loss: 1.2644 6.0599 sec/batch
Epoch 10/20  Iteration 1667/3560 Training loss: 1.2643 6.1283 sec/batch
Epoch 10/20  Iteration 1668/3560 Training loss: 1.2644 6.0417 sec/batch
Epoch 10/20  Iteration 1669/3560 Training loss: 1.2646 6.0716 sec/batch
Epoch 10/20  Iteration 1670/3560 Training loss: 1.2643 6.2955 sec/batch
Epoch 10/20  Iteration 1671/3560 Training loss: 1.2642 6.2667 sec/batch
Epoch 10/20  Iteration 1672/3560 Training loss: 1.2640 6.4041 sec/batch
Epoch 10/20  Iteration 1673/3560 Training loss: 1.2645 6.0857 sec/batch
Epoch 10/20  Iteration 1674/3560 Training loss: 1.2646 6.1319 sec/batch
Epoch 10/20  Iteration 1675/3560 Training loss: 1.2651 6.0338 sec/batch
Epoch 10/20  Iteration 1676/3560 Training loss: 1.2646 6.0035 sec/batch
Epoch 10/20  Iteration 1677/3560 Training loss: 1.2645 6.0283 sec/batch
Epoch 10/20  Iteration 1678/3560 Training loss: 1.2646 6.2993 sec/batch
Epoch 10/20  Iteration 1679/3560 Training loss: 1.2645 6.2850 sec/batch
Epoch 10/20  Iteration 1680/3560 Training loss: 1.2644 6.1585 sec/batch
Epoch 10/20  Iteration 1681/3560 Training loss: 1.2640 6.0043 sec/batch
Epoch 10/20  Iteration 1682/3560 Training loss: 1.2639 6.1652 sec/batch
Epoch 10/20  Iteration 1683/3560 Training loss: 1.2635 5.9916 sec/batch
Epoch 10/20  Iteration 1684/3560 Training loss: 1.2633 6.0638 sec/batch
Epoch 10/20  Iteration 1685/3560 Training loss: 1.2630 6.2337 sec/batch
Epoch 10/20  Iteration 1686/3560 Training loss: 1.2629 6.1416 sec/batch
Epoch 10/20  Iteration 1687/3560 Training loss: 1.2626 6.2318 sec/batch
Epoch 10/20  Iteration 1688/3560 Training loss: 1.2624 6.0042 sec/batch
Epoch 10/20  Iteration 1689/3560 Training loss: 1.2622 6.2890 sec/batch
Epoch 10/20  Iteration 1690/3560 Training loss: 1.2620 6.0753 sec/batch
Epoch 10/20  Iteration 1691/3560 Training loss: 1.2615 6.1257 sec/batch
Epoch 10/20  Iteration 1692/3560 Training loss: 1.2617 6.2188 sec/batch
Epoch 10/20  Iteration 1693/3560 Training loss: 1.2614 6.1141 sec/batch
Epoch 10/20  Iteration 1694/3560 Training loss: 1.2613 6.3357 sec/batch
Epoch 10/20  Iteration 1695/3560 Training loss: 1.2609 6.0281 sec/batch
Epoch 10/20  Iteration 1696/3560 Training loss: 1.2605 6.2179 sec/batch
Epoch 10/20  Iteration 1697/3560 Training loss: 1.2602 6.0747 sec/batch
Epoch 10/20  Iteration 1698/3560 Training loss: 1.2604 6.0337 sec/batch
Epoch 10/20  Iteration 1699/3560 Training loss: 1.2604 6.0984 sec/batch
Epoch 10/20  Iteration 1700/3560 Training loss: 1.2600 6.1616 sec/batch
Epoch 10/20  Iteration 1701/3560 Training loss: 1.2597 6.3684 sec/batch
Epoch 10/20  Iteration 1702/3560 Training loss: 1.2593 5.9872 sec/batch
Epoch 10/20  Iteration 1703/3560 Training loss: 1.2593 6.0603 sec/batch
Epoch 10/20  Iteration 1704/3560 Training loss: 1.2593 6.0284 sec/batch
Epoch 10/20  Iteration 1705/3560 Training loss: 1.2591 5.9740 sec/batch
Epoch 10/20  Iteration 1706/3560 Training loss: 1.2589 6.0963 sec/batch
Epoch 10/20  Iteration 1707/3560 Training loss: 1.2587 6.3296 sec/batch
Epoch 10/20  Iteration 1708/3560 Training loss: 1.2586 6.3663 sec/batch
Epoch 10/20  Iteration 1709/3560 Training loss: 1.2586 6.2690 sec/batch
Epoch 10/20  Iteration 1710/3560 Training loss: 1.2586 6.1015 sec/batch
Epoch 10/20  Iteration 1711/3560 Training loss: 1.2584 6.0815 sec/batch
Epoch 10/20  Iteration 1712/3560 Training loss: 1.2585 6.4435 sec/batch
Epoch 10/20  Iteration 1713/3560 Training loss: 1.2583 6.0991 sec/batch
Epoch 10/20  Iteration 1714/3560 Training loss: 1.2582 6.3613 sec/batch
Epoch 10/20  Iteration 1715/3560 Training loss: 1.2580 6.4695 sec/batch
Epoch 10/20  Iteration 1716/3560 Training loss: 1.2579 6.0253 sec/batch
Epoch 10/20  Iteration 1717/3560 Training loss: 1.2575 6.0816 sec/batch
Epoch 10/20  Iteration 1718/3560 Training loss: 1.2572 6.0659 sec/batch
Epoch 10/20  Iteration 1719/3560 Training loss: 1.2572 5.9713 sec/batch
Epoch 10/20  Iteration 1720/3560 Training loss: 1.2571 6.0454 sec/batch
Epoch 10/20  Iteration 1721/3560 Training loss: 1.2570 6.2883 sec/batch
Epoch 10/20  Iteration 1722/3560 Training loss: 1.2570 6.3234 sec/batch
Epoch 10/20  Iteration 1723/3560 Training loss: 1.2568 6.1096 sec/batch
Epoch 10/20  Iteration 1724/3560 Training loss: 1.2564 6.0048 sec/batch
Epoch 10/20  Iteration 1725/3560 Training loss: 1.2560 6.0542 sec/batch
Epoch 10/20  Iteration 1726/3560 Training loss: 1.2559 6.0642 sec/batch
Epoch 10/20  Iteration 1727/3560 Training loss: 1.2558 6.0805 sec/batch
Epoch 10/20  Iteration 1728/3560 Training loss: 1.2555 6.5194 sec/batch
Epoch 10/20  Iteration 1729/3560 Training loss: 1.2554 6.2674 sec/batch
Epoch 10/20  Iteration 1730/3560 Training loss: 1.2554 6.1550 sec/batch
Epoch 10/20  Iteration 1731/3560 Training loss: 1.2551 5.9979 sec/batch
Epoch 10/20  Iteration 1732/3560 Training loss: 1.2548 6.0538 sec/batch
Epoch 10/20  Iteration 1733/3560 Training loss: 1.2543 5.9754 sec/batch
Epoch 10/20  Iteration 1734/3560 Training loss: 1.2541 6.0419 sec/batch
Epoch 10/20  Iteration 1735/3560 Training loss: 1.2542 6.2248 sec/batch
Epoch 10/20  Iteration 1736/3560 Training loss: 1.2541 6.1758 sec/batch
Epoch 10/20  Iteration 1737/3560 Training loss: 1.2541 6.2519 sec/batch
Epoch 10/20  Iteration 1738/3560 Training loss: 1.2540 6.0729 sec/batch
Epoch 10/20  Iteration 1739/3560 Training loss: 1.2541 6.0742 sec/batch
Epoch 10/20  Iteration 1740/3560 Training loss: 1.2543 6.0899 sec/batch
Epoch 10/20  Iteration 1741/3560 Training loss: 1.2543 6.2008 sec/batch
Epoch 10/20  Iteration 1742/3560 Training loss: 1.2542 6.1685 sec/batch
Epoch 10/20  Iteration 1743/3560 Training loss: 1.2545 6.1498 sec/batch
Epoch 10/20  Iteration 1744/3560 Training loss: 1.2545 6.3616 sec/batch
Epoch 10/20  Iteration 1745/3560 Training loss: 1.2544 6.0591 sec/batch
Epoch 10/20  Iteration 1746/3560 Training loss: 1.2546 6.0720 sec/batch
Epoch 10/20  Iteration 1747/3560 Training loss: 1.2544 6.1985 sec/batch
Epoch 10/20  Iteration 1748/3560 Training loss: 1.2545 6.2385 sec/batch
Epoch 10/20  Iteration 1749/3560 Training loss: 1.2546 6.1432 sec/batch
Epoch 10/20  Iteration 1750/3560 Training loss: 1.2547 6.2266 sec/batch
Epoch 10/20  Iteration 1751/3560 Training loss: 1.2548 6.3519 sec/batch
Epoch 10/20  Iteration 1752/3560 Training loss: 1.2547 5.9779 sec/batch
Epoch 10/20  Iteration 1753/3560 Training loss: 1.2544 6.1261 sec/batch
Epoch 10/20  Iteration 1754/3560 Training loss: 1.2542 6.0634 sec/batch
Epoch 10/20  Iteration 1755/3560 Training loss: 1.2542 6.1082 sec/batch
Epoch 10/20  Iteration 1756/3560 Training loss: 1.2541 6.6482 sec/batch
Epoch 10/20  Iteration 1757/3560 Training loss: 1.2540 6.2458 sec/batch
Epoch 10/20  Iteration 1758/3560 Training loss: 1.2539 6.3284 sec/batch
Epoch 10/20  Iteration 1759/3560 Training loss: 1.2539 6.0381 sec/batch
Epoch 10/20  Iteration 1760/3560 Training loss: 1.2538 6.3700 sec/batch
Epoch 10/20  Iteration 1761/3560 Training loss: 1.2536 6.0299 sec/batch
Epoch 10/20  Iteration 1762/3560 Training loss: 1.2537 6.0306 sec/batch
Epoch 10/20  Iteration 1763/3560 Training loss: 1.2539 6.0391 sec/batch
Epoch 10/20  Iteration 1764/3560 Training loss: 1.2539 6.3591 sec/batch
Epoch 10/20  Iteration 1765/3560 Training loss: 1.2538 6.3352 sec/batch
Epoch 10/20  Iteration 1766/3560 Training loss: 1.2538 6.0635 sec/batch
Epoch 10/20  Iteration 1767/3560 Training loss: 1.2537 6.3238 sec/batch
Epoch 10/20  Iteration 1768/3560 Training loss: 1.2536 6.0232 sec/batch
Epoch 10/20  Iteration 1769/3560 Training loss: 1.2538 5.9684 sec/batch
Epoch 10/20  Iteration 1770/3560 Training loss: 1.2542 5.9909 sec/batch
Epoch 10/20  Iteration 1771/3560 Training loss: 1.2542 6.3255 sec/batch
Epoch 10/20  Iteration 1772/3560 Training loss: 1.2542 6.2999 sec/batch
Epoch 10/20  Iteration 1773/3560 Training loss: 1.2542 6.1202 sec/batch
Epoch 10/20  Iteration 1774/3560 Training loss: 1.2541 6.0116 sec/batch
Epoch 10/20  Iteration 1775/3560 Training loss: 1.2541 6.0123 sec/batch
Epoch 10/20  Iteration 1776/3560 Training loss: 1.2541 6.0310 sec/batch
Epoch 10/20  Iteration 1777/3560 Training loss: 1.2542 6.0403 sec/batch
Epoch 10/20  Iteration 1778/3560 Training loss: 1.2540 6.2472 sec/batch
Epoch 10/20  Iteration 1779/3560 Training loss: 1.2538 6.2464 sec/batch
Epoch 10/20  Iteration 1780/3560 Training loss: 1.2540 6.1541 sec/batch
Epoch 11/20  Iteration 1781/3560 Training loss: 1.3478 6.0753 sec/batch
Epoch 11/20  Iteration 1782/3560 Training loss: 1.3022 5.9809 sec/batch
Epoch 11/20  Iteration 1783/3560 Training loss: 1.2827 6.0427 sec/batch
Epoch 11/20  Iteration 1784/3560 Training loss: 1.2781 6.0268 sec/batch
Epoch 11/20  Iteration 1785/3560 Training loss: 1.2692 6.2452 sec/batch
Epoch 11/20  Iteration 1786/3560 Training loss: 1.2578 6.2895 sec/batch
Epoch 11/20  Iteration 1787/3560 Training loss: 1.2580 6.3331 sec/batch
Epoch 11/20  Iteration 1788/3560 Training loss: 1.2562 6.0640 sec/batch
Epoch 11/20  Iteration 1789/3560 Training loss: 1.2553 6.0180 sec/batch
Epoch 11/20  Iteration 1790/3560 Training loss: 1.2540 6.0022 sec/batch
Epoch 11/20  Iteration 1791/3560 Training loss: 1.2509 6.0227 sec/batch
Epoch 11/20  Iteration 1792/3560 Training loss: 1.2501 6.0762 sec/batch
Epoch 11/20  Iteration 1793/3560 Training loss: 1.2504 6.2038 sec/batch
Epoch 11/20  Iteration 1794/3560 Training loss: 1.2510 6.3490 sec/batch
Epoch 11/20  Iteration 1795/3560 Training loss: 1.2496 5.9919 sec/batch
Epoch 11/20  Iteration 1796/3560 Training loss: 1.2475 6.0180 sec/batch
Epoch 11/20  Iteration 1797/3560 Training loss: 1.2475 6.0141 sec/batch
Epoch 11/20  Iteration 1798/3560 Training loss: 1.2483 6.0131 sec/batch
Epoch 11/20  Iteration 1799/3560 Training loss: 1.2475 6.7738 sec/batch
Epoch 11/20  Iteration 1800/3560 Training loss: 1.2487 6.1641 sec/batch
Epoch 11/20  Iteration 1801/3560 Training loss: 1.2482 6.3780 sec/batch
Epoch 11/20  Iteration 1802/3560 Training loss: 1.2485 6.0055 sec/batch
Epoch 11/20  Iteration 1803/3560 Training loss: 1.2478 6.1251 sec/batch
Epoch 11/20  Iteration 1804/3560 Training loss: 1.2481 6.0109 sec/batch
Epoch 11/20  Iteration 1805/3560 Training loss: 1.2474 6.0227 sec/batch
Epoch 11/20  Iteration 1806/3560 Training loss: 1.2458 6.3543 sec/batch
Epoch 11/20  Iteration 1807/3560 Training loss: 1.2446 6.3116 sec/batch
Epoch 11/20  Iteration 1808/3560 Training loss: 1.2451 6.4049 sec/batch
Epoch 11/20  Iteration 1809/3560 Training loss: 1.2450 6.3165 sec/batch
Epoch 11/20  Iteration 1810/3560 Training loss: 1.2455 6.1178 sec/batch
Epoch 11/20  Iteration 1811/3560 Training loss: 1.2448 6.0077 sec/batch
Epoch 11/20  Iteration 1812/3560 Training loss: 1.2436 6.0148 sec/batch
Epoch 11/20  Iteration 1813/3560 Training loss: 1.2438 6.0470 sec/batch
Epoch 11/20  Iteration 1814/3560 Training loss: 1.2436 6.2977 sec/batch
Epoch 11/20  Iteration 1815/3560 Training loss: 1.2432 6.3668 sec/batch
Epoch 11/20  Iteration 1816/3560 Training loss: 1.2429 6.0072 sec/batch
Epoch 11/20  Iteration 1817/3560 Training loss: 1.2423 5.9302 sec/batch
Epoch 11/20  Iteration 1818/3560 Training loss: 1.2410 6.0030 sec/batch
Epoch 11/20  Iteration 1819/3560 Training loss: 1.2399 5.9819 sec/batch
Epoch 11/20  Iteration 1820/3560 Training loss: 1.2396 6.0982 sec/batch
Epoch 11/20  Iteration 1821/3560 Training loss: 1.2388 6.2755 sec/batch
Epoch 11/20  Iteration 1822/3560 Training loss: 1.2395 6.1833 sec/batch
Epoch 11/20  Iteration 1823/3560 Training loss: 1.2390 6.1543 sec/batch
Epoch 11/20  Iteration 1824/3560 Training loss: 1.2384 6.0527 sec/batch
Epoch 11/20  Iteration 1825/3560 Training loss: 1.2384 6.2478 sec/batch
Epoch 11/20  Iteration 1826/3560 Training loss: 1.2376 6.0689 sec/batch
Epoch 11/20  Iteration 1827/3560 Training loss: 1.2372 6.0378 sec/batch
Epoch 11/20  Iteration 1828/3560 Training loss: 1.2366 6.2411 sec/batch
Epoch 11/20  Iteration 1829/3560 Training loss: 1.2364 6.1650 sec/batch
Epoch 11/20  Iteration 1830/3560 Training loss: 1.2367 6.2674 sec/batch
Epoch 11/20  Iteration 1831/3560 Training loss: 1.2360 6.0475 sec/batch
Epoch 11/20  Iteration 1832/3560 Training loss: 1.2366 6.0171 sec/batch
Epoch 11/20  Iteration 1833/3560 Training loss: 1.2366 6.0505 sec/batch
Epoch 11/20  Iteration 1834/3560 Training loss: 1.2368 5.9645 sec/batch
Epoch 11/20  Iteration 1835/3560 Training loss: 1.2365 6.1123 sec/batch
Epoch 11/20  Iteration 1836/3560 Training loss: 1.2365 6.2004 sec/batch
Epoch 11/20  Iteration 1837/3560 Training loss: 1.2369 6.3940 sec/batch
Epoch 11/20  Iteration 1838/3560 Training loss: 1.2365 6.0495 sec/batch
Epoch 11/20  Iteration 1839/3560 Training loss: 1.2361 6.0383 sec/batch
Epoch 11/20  Iteration 1840/3560 Training loss: 1.2367 6.0261 sec/batch
Epoch 11/20  Iteration 1841/3560 Training loss: 1.2368 5.9997 sec/batch
Epoch 11/20  Iteration 1842/3560 Training loss: 1.2377 6.1348 sec/batch
Epoch 11/20  Iteration 1843/3560 Training loss: 1.2379 6.1877 sec/batch
Epoch 11/20  Iteration 1844/3560 Training loss: 1.2381 6.3733 sec/batch
Epoch 11/20  Iteration 1845/3560 Training loss: 1.2381 6.2591 sec/batch
Epoch 11/20  Iteration 1846/3560 Training loss: 1.2382 6.1107 sec/batch
Epoch 11/20  Iteration 1847/3560 Training loss: 1.2385 6.0156 sec/batch
Epoch 11/20  Iteration 1848/3560 Training loss: 1.2383 6.0174 sec/batch
Epoch 11/20  Iteration 1849/3560 Training loss: 1.2382 6.0302 sec/batch
Epoch 11/20  Iteration 1850/3560 Training loss: 1.2382 6.3086 sec/batch
Epoch 11/20  Iteration 1851/3560 Training loss: 1.2387 6.3699 sec/batch
Epoch 11/20  Iteration 1852/3560 Training loss: 1.2389 6.0429 sec/batch
Epoch 11/20  Iteration 1853/3560 Training loss: 1.2394 6.0829 sec/batch
Epoch 11/20  Iteration 1854/3560 Training loss: 1.2391 6.0257 sec/batch
Epoch 11/20  Iteration 1855/3560 Training loss: 1.2391 5.9607 sec/batch
Epoch 11/20  Iteration 1856/3560 Training loss: 1.2392 6.0196 sec/batch
Epoch 11/20  Iteration 1857/3560 Training loss: 1.2391 6.2857 sec/batch
Epoch 11/20  Iteration 1858/3560 Training loss: 1.2391 6.6325 sec/batch
Epoch 11/20  Iteration 1859/3560 Training loss: 1.2385 6.1329 sec/batch
Epoch 11/20  Iteration 1860/3560 Training loss: 1.2384 6.1288 sec/batch
Epoch 11/20  Iteration 1861/3560 Training loss: 1.2380 6.0179 sec/batch
Epoch 11/20  Iteration 1862/3560 Training loss: 1.2380 5.9958 sec/batch
Epoch 11/20  Iteration 1863/3560 Training loss: 1.2375 6.0798 sec/batch
Epoch 11/20  Iteration 1864/3560 Training loss: 1.2375 6.2949 sec/batch
Epoch 11/20  Iteration 1865/3560 Training loss: 1.2373 6.5932 sec/batch
Epoch 11/20  Iteration 1866/3560 Training loss: 1.2372 6.0782 sec/batch
Epoch 11/20  Iteration 1867/3560 Training loss: 1.2371 6.0348 sec/batch
Epoch 11/20  Iteration 1868/3560 Training loss: 1.2370 5.9953 sec/batch
Epoch 11/20  Iteration 1869/3560 Training loss: 1.2366 6.0537 sec/batch
Epoch 11/20  Iteration 1870/3560 Training loss: 1.2366 6.0369 sec/batch
Epoch 11/20  Iteration 1871/3560 Training loss: 1.2364 6.2175 sec/batch
Epoch 11/20  Iteration 1872/3560 Training loss: 1.2363 6.1826 sec/batch
Epoch 11/20  Iteration 1873/3560 Training loss: 1.2358 6.2563 sec/batch
Epoch 11/20  Iteration 1874/3560 Training loss: 1.2355 6.0619 sec/batch
Epoch 11/20  Iteration 1875/3560 Training loss: 1.2353 6.0235 sec/batch
Epoch 11/20  Iteration 1876/3560 Training loss: 1.2353 6.0794 sec/batch
Epoch 11/20  Iteration 1877/3560 Training loss: 1.2353 5.9990 sec/batch
Epoch 11/20  Iteration 1878/3560 Training loss: 1.2349 6.0874 sec/batch
Epoch 11/20  Iteration 1879/3560 Training loss: 1.2345 6.1941 sec/batch
Epoch 11/20  Iteration 1880/3560 Training loss: 1.2342 6.3716 sec/batch
Epoch 11/20  Iteration 1881/3560 Training loss: 1.2341 6.0239 sec/batch
Epoch 11/20  Iteration 1882/3560 Training loss: 1.2339 6.0277 sec/batch
Epoch 11/20  Iteration 1883/3560 Training loss: 1.2338 6.0441 sec/batch
Epoch 11/20  Iteration 1884/3560 Training loss: 1.2336 6.3245 sec/batch
Epoch 11/20  Iteration 1885/3560 Training loss: 1.2335 6.0867 sec/batch
Epoch 11/20  Iteration 1886/3560 Training loss: 1.2333 6.1681 sec/batch
Epoch 11/20  Iteration 1887/3560 Training loss: 1.2333 6.3350 sec/batch
Epoch 11/20  Iteration 1888/3560 Training loss: 1.2332 5.9747 sec/batch
Epoch 11/20  Iteration 1889/3560 Training loss: 1.2331 6.0125 sec/batch
Epoch 11/20  Iteration 1890/3560 Training loss: 1.2332 6.0486 sec/batch
Epoch 11/20  Iteration 1891/3560 Training loss: 1.2330 5.9714 sec/batch
Epoch 11/20  Iteration 1892/3560 Training loss: 1.2329 6.0423 sec/batch
Epoch 11/20  Iteration 1893/3560 Training loss: 1.2328 6.2486 sec/batch
Epoch 11/20  Iteration 1894/3560 Training loss: 1.2327 6.3692 sec/batch
Epoch 11/20  Iteration 1895/3560 Training loss: 1.2324 6.0052 sec/batch
Epoch 11/20  Iteration 1896/3560 Training loss: 1.2322 6.0621 sec/batch
Epoch 11/20  Iteration 1897/3560 Training loss: 1.2322 6.0267 sec/batch
Epoch 11/20  Iteration 1898/3560 Training loss: 1.2322 5.9967 sec/batch
Epoch 11/20  Iteration 1899/3560 Training loss: 1.2320 5.9886 sec/batch
Epoch 11/20  Iteration 1900/3560 Training loss: 1.2319 6.2729 sec/batch
Epoch 11/20  Iteration 1901/3560 Training loss: 1.2317 6.2689 sec/batch
Epoch 11/20  Iteration 1902/3560 Training loss: 1.2313 6.1496 sec/batch
Epoch 11/20  Iteration 1903/3560 Training loss: 1.2309 6.0347 sec/batch
Epoch 11/20  Iteration 1904/3560 Training loss: 1.2309 6.5114 sec/batch
Epoch 11/20  Iteration 1905/3560 Training loss: 1.2308 6.0074 sec/batch
Epoch 11/20  Iteration 1906/3560 Training loss: 1.2304 6.0766 sec/batch
Epoch 11/20  Iteration 1907/3560 Training loss: 1.2304 6.7694 sec/batch
Epoch 11/20  Iteration 1908/3560 Training loss: 1.2304 6.2417 sec/batch
Epoch 11/20  Iteration 1909/3560 Training loss: 1.2301 6.2116 sec/batch
Epoch 11/20  Iteration 1910/3560 Training loss: 1.2298 6.0968 sec/batch
Epoch 11/20  Iteration 1911/3560 Training loss: 1.2294 6.0266 sec/batch
Epoch 11/20  Iteration 1912/3560 Training loss: 1.2293 6.0140 sec/batch
Epoch 11/20  Iteration 1913/3560 Training loss: 1.2294 6.0162 sec/batch
Epoch 11/20  Iteration 1914/3560 Training loss: 1.2294 6.2169 sec/batch
Epoch 11/20  Iteration 1915/3560 Training loss: 1.2294 6.1656 sec/batch
Epoch 11/20  Iteration 1916/3560 Training loss: 1.2295 6.2897 sec/batch
Epoch 11/20  Iteration 1917/3560 Training loss: 1.2297 6.0208 sec/batch
Epoch 11/20  Iteration 1918/3560 Training loss: 1.2298 6.0293 sec/batch
Epoch 11/20  Iteration 1919/3560 Training loss: 1.2298 6.0837 sec/batch
Epoch 11/20  Iteration 1920/3560 Training loss: 1.2298 6.0658 sec/batch
Epoch 11/20  Iteration 1921/3560 Training loss: 1.2301 6.0881 sec/batch
Epoch 11/20  Iteration 1922/3560 Training loss: 1.2302 6.1687 sec/batch
Epoch 11/20  Iteration 1923/3560 Training loss: 1.2300 6.6411 sec/batch
Epoch 11/20  Iteration 1924/3560 Training loss: 1.2302 6.0328 sec/batch
Epoch 11/20  Iteration 1925/3560 Training loss: 1.2300 6.0743 sec/batch
Epoch 11/20  Iteration 1926/3560 Training loss: 1.2301 6.0111 sec/batch
Epoch 11/20  Iteration 1927/3560 Training loss: 1.2302 6.0439 sec/batch
Epoch 11/20  Iteration 1928/3560 Training loss: 1.2305 6.0505 sec/batch
Epoch 11/20  Iteration 1929/3560 Training loss: 1.2306 6.1829 sec/batch
Epoch 11/20  Iteration 1930/3560 Training loss: 1.2306 6.3482 sec/batch
Epoch 11/20  Iteration 1931/3560 Training loss: 1.2304 6.0420 sec/batch
Epoch 11/20  Iteration 1932/3560 Training loss: 1.2302 6.0231 sec/batch
Epoch 11/20  Iteration 1933/3560 Training loss: 1.2302 6.0384 sec/batch
Epoch 11/20  Iteration 1934/3560 Training loss: 1.2302 5.9476 sec/batch
Epoch 11/20  Iteration 1935/3560 Training loss: 1.2302 6.1710 sec/batch
Epoch 11/20  Iteration 1936/3560 Training loss: 1.2301 7.6087 sec/batch
Epoch 11/20  Iteration 1937/3560 Training loss: 1.2302 6.7010 sec/batch
Epoch 11/20  Iteration 1938/3560 Training loss: 1.2300 6.0660 sec/batch
Epoch 11/20  Iteration 1939/3560 Training loss: 1.2298 6.0684 sec/batch
Epoch 11/20  Iteration 1940/3560 Training loss: 1.2298 6.0624 sec/batch
Epoch 11/20  Iteration 1941/3560 Training loss: 1.2300 6.0986 sec/batch
Epoch 11/20  Iteration 1942/3560 Training loss: 1.2299 6.6512 sec/batch
Epoch 11/20  Iteration 1943/3560 Training loss: 1.2300 6.2170 sec/batch
Epoch 11/20  Iteration 1944/3560 Training loss: 1.2299 6.2687 sec/batch
Epoch 11/20  Iteration 1945/3560 Training loss: 1.2299 6.1435 sec/batch
Epoch 11/20  Iteration 1946/3560 Training loss: 1.2299 6.0251 sec/batch
Epoch 11/20  Iteration 1947/3560 Training loss: 1.2300 6.1059 sec/batch
Epoch 11/20  Iteration 1948/3560 Training loss: 1.2304 6.0269 sec/batch
Epoch 11/20  Iteration 1949/3560 Training loss: 1.2304 6.2979 sec/batch
Epoch 11/20  Iteration 1950/3560 Training loss: 1.2305 6.2457 sec/batch
Epoch 11/20  Iteration 1951/3560 Training loss: 1.2304 6.2042 sec/batch
Epoch 11/20  Iteration 1952/3560 Training loss: 1.2303 6.1005 sec/batch
Epoch 11/20  Iteration 1953/3560 Training loss: 1.2304 6.0591 sec/batch
Epoch 11/20  Iteration 1954/3560 Training loss: 1.2304 6.0534 sec/batch
Epoch 11/20  Iteration 1955/3560 Training loss: 1.2305 6.3148 sec/batch
Epoch 11/20  Iteration 1956/3560 Training loss: 1.2303 6.3794 sec/batch
Epoch 11/20  Iteration 1957/3560 Training loss: 1.2301 6.3234 sec/batch
Epoch 11/20  Iteration 1958/3560 Training loss: 1.2303 6.1398 sec/batch
Epoch 12/20  Iteration 1959/3560 Training loss: 1.3360 6.0588 sec/batch
Epoch 12/20  Iteration 1960/3560 Training loss: 1.2882 6.0840 sec/batch
Epoch 12/20  Iteration 1961/3560 Training loss: 1.2699 6.2681 sec/batch
Epoch 12/20  Iteration 1962/3560 Training loss: 1.2621 6.2831 sec/batch
Epoch 12/20  Iteration 1963/3560 Training loss: 1.2518 6.2374 sec/batch
Epoch 12/20  Iteration 1964/3560 Training loss: 1.2407 6.4276 sec/batch
Epoch 12/20  Iteration 1965/3560 Training loss: 1.2401 6.0249 sec/batch
Epoch 12/20  Iteration 1966/3560 Training loss: 1.2379 6.0582 sec/batch
Epoch 12/20  Iteration 1967/3560 Training loss: 1.2364 6.0096 sec/batch
Epoch 12/20  Iteration 1968/3560 Training loss: 1.2356 6.0897 sec/batch
Epoch 12/20  Iteration 1969/3560 Training loss: 1.2342 6.1225 sec/batch
Epoch 12/20  Iteration 1970/3560 Training loss: 1.2335 6.2288 sec/batch
Epoch 12/20  Iteration 1971/3560 Training loss: 1.2333 6.3901 sec/batch
Epoch 12/20  Iteration 1972/3560 Training loss: 1.2331 6.0182 sec/batch
Epoch 12/20  Iteration 1973/3560 Training loss: 1.2312 6.0933 sec/batch
Epoch 12/20  Iteration 1974/3560 Training loss: 1.2286 6.0974 sec/batch
Epoch 12/20  Iteration 1975/3560 Training loss: 1.2284 6.0245 sec/batch
Epoch 12/20  Iteration 1976/3560 Training loss: 1.2289 6.2339 sec/batch
Epoch 12/20  Iteration 1977/3560 Training loss: 1.2284 6.1405 sec/batch
Epoch 12/20  Iteration 1978/3560 Training loss: 1.2296 6.3444 sec/batch
Epoch 12/20  Iteration 1979/3560 Training loss: 1.2289 6.1106 sec/batch
Epoch 12/20  Iteration 1980/3560 Training loss: 1.2290 6.0300 sec/batch
Epoch 12/20  Iteration 1981/3560 Training loss: 1.2284 6.3678 sec/batch
Epoch 12/20  Iteration 1982/3560 Training loss: 1.2284 6.0997 sec/batch
Epoch 12/20  Iteration 1983/3560 Training loss: 1.2283 6.3973 sec/batch
Epoch 12/20  Iteration 1984/3560 Training loss: 1.2264 6.2499 sec/batch
Epoch 12/20  Iteration 1985/3560 Training loss: 1.2250 6.1301 sec/batch
Epoch 12/20  Iteration 1986/3560 Training loss: 1.2257 6.1205 sec/batch
Epoch 12/20  Iteration 1987/3560 Training loss: 1.2256 6.0661 sec/batch
Epoch 12/20  Iteration 1988/3560 Training loss: 1.2258 6.0707 sec/batch
Epoch 12/20  Iteration 1989/3560 Training loss: 1.2253 6.0543 sec/batch
Epoch 12/20  Iteration 1990/3560 Training loss: 1.2244 6.2913 sec/batch
Epoch 12/20  Iteration 1991/3560 Training loss: 1.2246 6.3398 sec/batch
Epoch 12/20  Iteration 1992/3560 Training loss: 1.2248 6.1214 sec/batch
Epoch 12/20  Iteration 1993/3560 Training loss: 1.2244 6.1012 sec/batch
Epoch 12/20  Iteration 1994/3560 Training loss: 1.2242 6.0551 sec/batch
Epoch 12/20  Iteration 1995/3560 Training loss: 1.2236 5.9905 sec/batch
Epoch 12/20  Iteration 1996/3560 Training loss: 1.2227 6.0592 sec/batch
Epoch 12/20  Iteration 1997/3560 Training loss: 1.2216 6.3815 sec/batch
Epoch 12/20  Iteration 1998/3560 Training loss: 1.2213 6.3937 sec/batch
Epoch 12/20  Iteration 1999/3560 Training loss: 1.2208 6.0326 sec/batch
Epoch 12/20  Iteration 2000/3560 Training loss: 1.2218 6.2867 sec/batch
Epoch 12/20  Iteration 2001/3560 Training loss: 1.2216 6.1391 sec/batch
Epoch 12/20  Iteration 2002/3560 Training loss: 1.2208 6.0995 sec/batch
Epoch 12/20  Iteration 2003/3560 Training loss: 1.2208 6.1326 sec/batch
Epoch 12/20  Iteration 2004/3560 Training loss: 1.2203 7.0376 sec/batch
Epoch 12/20  Iteration 2005/3560 Training loss: 1.2198 6.4703 sec/batch
Epoch 12/20  Iteration 2006/3560 Training loss: 1.2195 6.0599 sec/batch
Epoch 12/20  Iteration 2007/3560 Training loss: 1.2194 6.0622 sec/batch
Epoch 12/20  Iteration 2008/3560 Training loss: 1.2195 6.0497 sec/batch
Epoch 12/20  Iteration 2009/3560 Training loss: 1.2190 6.0475 sec/batch
Epoch 12/20  Iteration 2010/3560 Training loss: 1.2194 6.1817 sec/batch
Epoch 12/20  Iteration 2011/3560 Training loss: 1.2194 6.1643 sec/batch
Epoch 12/20  Iteration 2012/3560 Training loss: 1.2196 6.2232 sec/batch
Epoch 12/20  Iteration 2013/3560 Training loss: 1.2194 6.0834 sec/batch
Epoch 12/20  Iteration 2014/3560 Training loss: 1.2195 6.0707 sec/batch
Epoch 12/20  Iteration 2015/3560 Training loss: 1.2197 6.0175 sec/batch
Epoch 12/20  Iteration 2016/3560 Training loss: 1.2194 6.0526 sec/batch
Epoch 12/20  Iteration 2017/3560 Training loss: 1.2188 6.2939 sec/batch
Epoch 12/20  Iteration 2018/3560 Training loss: 1.2193 6.2581 sec/batch
Epoch 12/20  Iteration 2019/3560 Training loss: 1.2192 6.1441 sec/batch
Epoch 12/20  Iteration 2020/3560 Training loss: 1.2199 6.4320 sec/batch
Epoch 12/20  Iteration 2021/3560 Training loss: 1.2202 6.1324 sec/batch
Epoch 12/20  Iteration 2022/3560 Training loss: 1.2202 5.9808 sec/batch
Epoch 12/20  Iteration 2023/3560 Training loss: 1.2201 6.0573 sec/batch
Epoch 12/20  Iteration 2024/3560 Training loss: 1.2201 6.2955 sec/batch
Epoch 12/20  Iteration 2025/3560 Training loss: 1.2202 6.3338 sec/batch
Epoch 12/20  Iteration 2026/3560 Training loss: 1.2198 6.0310 sec/batch
Epoch 12/20  Iteration 2027/3560 Training loss: 1.2198 6.0265 sec/batch
Epoch 12/20  Iteration 2028/3560 Training loss: 1.2197 6.0745 sec/batch
Epoch 12/20  Iteration 2029/3560 Training loss: 1.2202 6.0102 sec/batch
Epoch 12/20  Iteration 2030/3560 Training loss: 1.2205 6.0823 sec/batch
Epoch 12/20  Iteration 2031/3560 Training loss: 1.2209 6.2871 sec/batch
Epoch 12/20  Iteration 2032/3560 Training loss: 1.2204 6.3915 sec/batch
Epoch 12/20  Iteration 2033/3560 Training loss: 1.2203 6.0164 sec/batch
Epoch 12/20  Iteration 2034/3560 Training loss: 1.2204 6.0896 sec/batch
Epoch 12/20  Iteration 2035/3560 Training loss: 1.2203 6.1420 sec/batch
Epoch 12/20  Iteration 2036/3560 Training loss: 1.2200 6.0255 sec/batch
Epoch 12/20  Iteration 2037/3560 Training loss: 1.2195 6.1386 sec/batch
Epoch 12/20  Iteration 2038/3560 Training loss: 1.2194 6.1726 sec/batch
Epoch 12/20  Iteration 2039/3560 Training loss: 1.2188 6.5447 sec/batch
Epoch 12/20  Iteration 2040/3560 Training loss: 1.2187 6.0712 sec/batch
Epoch 12/20  Iteration 2041/3560 Training loss: 1.2183 6.0579 sec/batch
Epoch 12/20  Iteration 2042/3560 Training loss: 1.2182 6.0675 sec/batch
Epoch 12/20  Iteration 2043/3560 Training loss: 1.2179 6.0264 sec/batch
Epoch 12/20  Iteration 2044/3560 Training loss: 1.2178 6.0753 sec/batch
Epoch 12/20  Iteration 2045/3560 Training loss: 1.2175 6.2096 sec/batch
Epoch 12/20  Iteration 2046/3560 Training loss: 1.2173 6.3317 sec/batch
Epoch 12/20  Iteration 2047/3560 Training loss: 1.2170 6.0717 sec/batch
Epoch 12/20  Iteration 2048/3560 Training loss: 1.2170 5.9856 sec/batch
Epoch 12/20  Iteration 2049/3560 Training loss: 1.2166 6.0684 sec/batch
Epoch 12/20  Iteration 2050/3560 Training loss: 1.2166 6.0559 sec/batch
Epoch 12/20  Iteration 2051/3560 Training loss: 1.2163 6.0292 sec/batch
Epoch 12/20  Iteration 2052/3560 Training loss: 1.2158 6.6944 sec/batch
Epoch 12/20  Iteration 2053/3560 Training loss: 1.2155 6.4709 sec/batch
Epoch 12/20  Iteration 2054/3560 Training loss: 1.2155 6.0777 sec/batch
Epoch 12/20  Iteration 2055/3560 Training loss: 1.2155 6.1050 sec/batch
Epoch 12/20  Iteration 2056/3560 Training loss: 1.2151 5.9763 sec/batch
Epoch 12/20  Iteration 2057/3560 Training loss: 1.2148 6.0289 sec/batch
Epoch 12/20  Iteration 2058/3560 Training loss: 1.2145 6.0671 sec/batch
Epoch 12/20  Iteration 2059/3560 Training loss: 1.2144 6.4899 sec/batch
Epoch 12/20  Iteration 2060/3560 Training loss: 1.2143 6.3192 sec/batch
Epoch 12/20  Iteration 2061/3560 Training loss: 1.2142 6.0908 sec/batch
Epoch 12/20  Iteration 2062/3560 Training loss: 1.2140 6.0336 sec/batch
Epoch 12/20  Iteration 2063/3560 Training loss: 1.2138 6.0244 sec/batch
Epoch 12/20  Iteration 2064/3560 Training loss: 1.2137 6.0116 sec/batch
Epoch 12/20  Iteration 2065/3560 Training loss: 1.2136 6.0067 sec/batch
Epoch 12/20  Iteration 2066/3560 Training loss: 1.2136 6.2594 sec/batch
Epoch 12/20  Iteration 2067/3560 Training loss: 1.2135 6.1751 sec/batch
Epoch 12/20  Iteration 2068/3560 Training loss: 1.2135 6.1844 sec/batch
Epoch 12/20  Iteration 2069/3560 Training loss: 1.2133 6.0769 sec/batch
Epoch 12/20  Iteration 2070/3560 Training loss: 1.2132 5.9942 sec/batch
Epoch 12/20  Iteration 2071/3560 Training loss: 1.2132 6.0671 sec/batch
Epoch 12/20  Iteration 2072/3560 Training loss: 1.2130 6.0257 sec/batch
Epoch 12/20  Iteration 2073/3560 Training loss: 1.2127 6.1897 sec/batch
Epoch 12/20  Iteration 2074/3560 Training loss: 1.2124 5.9979 sec/batch
Epoch 12/20  Iteration 2075/3560 Training loss: 1.2125 6.3617 sec/batch
Epoch 12/20  Iteration 2076/3560 Training loss: 1.2125 6.0299 sec/batch
Epoch 12/20  Iteration 2077/3560 Training loss: 1.2123 6.0316 sec/batch
Epoch 12/20  Iteration 2078/3560 Training loss: 1.2122 6.0382 sec/batch
Epoch 12/20  Iteration 2079/3560 Training loss: 1.2120 6.2742 sec/batch
Epoch 12/20  Iteration 2080/3560 Training loss: 1.2117 6.1199 sec/batch
Epoch 12/20  Iteration 2081/3560 Training loss: 1.2113 6.2295 sec/batch
Epoch 12/20  Iteration 2082/3560 Training loss: 1.2112 6.3413 sec/batch
Epoch 12/20  Iteration 2083/3560 Training loss: 1.2111 6.0358 sec/batch
Epoch 12/20  Iteration 2084/3560 Training loss: 1.2107 6.0379 sec/batch
Epoch 12/20  Iteration 2085/3560 Training loss: 1.2107 6.0196 sec/batch
Epoch 12/20  Iteration 2086/3560 Training loss: 1.2107 6.0002 sec/batch
Epoch 12/20  Iteration 2087/3560 Training loss: 1.2106 6.0788 sec/batch
Epoch 12/20  Iteration 2088/3560 Training loss: 1.2102 6.2569 sec/batch
Epoch 12/20  Iteration 2089/3560 Training loss: 1.2098 6.3700 sec/batch
Epoch 12/20  Iteration 2090/3560 Training loss: 1.2096 5.9472 sec/batch
Epoch 12/20  Iteration 2091/3560 Training loss: 1.2097 6.1070 sec/batch
Epoch 12/20  Iteration 2092/3560 Training loss: 1.2097 6.0657 sec/batch
Epoch 12/20  Iteration 2093/3560 Training loss: 1.2097 5.9947 sec/batch
Epoch 12/20  Iteration 2094/3560 Training loss: 1.2096 6.0244 sec/batch
Epoch 12/20  Iteration 2095/3560 Training loss: 1.2098 6.2657 sec/batch
Epoch 12/20  Iteration 2096/3560 Training loss: 1.2099 6.2872 sec/batch
Epoch 12/20  Iteration 2097/3560 Training loss: 1.2099 6.1005 sec/batch
Epoch 12/20  Iteration 2098/3560 Training loss: 1.2099 6.3151 sec/batch
Epoch 12/20  Iteration 2099/3560 Training loss: 1.2102 6.0242 sec/batch
Epoch 12/20  Iteration 2100/3560 Training loss: 1.2102 6.4879 sec/batch
Epoch 12/20  Iteration 2101/3560 Training loss: 1.2102 7.5553 sec/batch
Epoch 12/20  Iteration 2102/3560 Training loss: 1.2105 6.3692 sec/batch
Epoch 12/20  Iteration 2103/3560 Training loss: 1.2103 6.3577 sec/batch
Epoch 12/20  Iteration 2104/3560 Training loss: 1.2105 6.0488 sec/batch
Epoch 12/20  Iteration 2105/3560 Training loss: 1.2105 6.0825 sec/batch
Epoch 12/20  Iteration 2106/3560 Training loss: 1.2107 6.0162 sec/batch
Epoch 12/20  Iteration 2107/3560 Training loss: 1.2108 6.0404 sec/batch
Epoch 12/20  Iteration 2108/3560 Training loss: 1.2108 6.1037 sec/batch
Epoch 12/20  Iteration 2109/3560 Training loss: 1.2105 6.1774 sec/batch
Epoch 12/20  Iteration 2110/3560 Training loss: 1.2103 6.3704 sec/batch
Epoch 12/20  Iteration 2111/3560 Training loss: 1.2103 5.9835 sec/batch
Epoch 12/20  Iteration 2112/3560 Training loss: 1.2102 6.0311 sec/batch
Epoch 12/20  Iteration 2113/3560 Training loss: 1.2102 6.0425 sec/batch
Epoch 12/20  Iteration 2114/3560 Training loss: 1.2102 6.0170 sec/batch
Epoch 12/20  Iteration 2115/3560 Training loss: 1.2101 6.1187 sec/batch
Epoch 12/20  Iteration 2116/3560 Training loss: 1.2100 6.2097 sec/batch
Epoch 12/20  Iteration 2117/3560 Training loss: 1.2098 6.7416 sec/batch
Epoch 12/20  Iteration 2118/3560 Training loss: 1.2100 6.0567 sec/batch
Epoch 12/20  Iteration 2119/3560 Training loss: 1.2101 6.0223 sec/batch
Epoch 12/20  Iteration 2120/3560 Training loss: 1.2101 6.0908 sec/batch
Epoch 12/20  Iteration 2121/3560 Training loss: 1.2101 6.0075 sec/batch
Epoch 12/20  Iteration 2122/3560 Training loss: 1.2101 6.2405 sec/batch
Epoch 12/20  Iteration 2123/3560 Training loss: 1.2100 6.4209 sec/batch
Epoch 12/20  Iteration 2124/3560 Training loss: 1.2099 6.8502 sec/batch
Epoch 12/20  Iteration 2125/3560 Training loss: 1.2100 6.4294 sec/batch
Epoch 12/20  Iteration 2126/3560 Training loss: 1.2103 7.1431 sec/batch
Epoch 12/20  Iteration 2127/3560 Training loss: 1.2104 7.6202 sec/batch
Epoch 12/20  Iteration 2128/3560 Training loss: 1.2104 7.6109 sec/batch
Epoch 12/20  Iteration 2129/3560 Training loss: 1.2103 7.9212 sec/batch
Epoch 12/20  Iteration 2130/3560 Training loss: 1.2103 7.3686 sec/batch
Epoch 12/20  Iteration 2131/3560 Training loss: 1.2104 6.8240 sec/batch
Epoch 12/20  Iteration 2132/3560 Training loss: 1.2105 7.2493 sec/batch
Epoch 12/20  Iteration 2133/3560 Training loss: 1.2105 6.7362 sec/batch
Epoch 12/20  Iteration 2134/3560 Training loss: 1.2104 6.3144 sec/batch
Epoch 12/20  Iteration 2135/3560 Training loss: 1.2103 6.5526 sec/batch
Epoch 12/20  Iteration 2136/3560 Training loss: 1.2105 6.1648 sec/batch
Epoch 13/20  Iteration 2137/3560 Training loss: 1.3082 6.0645 sec/batch
Epoch 13/20  Iteration 2138/3560 Training loss: 1.2671 6.0083 sec/batch
Epoch 13/20  Iteration 2139/3560 Training loss: 1.2479 5.9869 sec/batch
Epoch 13/20  Iteration 2140/3560 Training loss: 1.2428 6.0727 sec/batch
Epoch 13/20  Iteration 2141/3560 Training loss: 1.2323 6.2936 sec/batch
Epoch 13/20  Iteration 2142/3560 Training loss: 1.2210 6.2922 sec/batch
Epoch 13/20  Iteration 2143/3560 Training loss: 1.2205 6.0401 sec/batch
Epoch 13/20  Iteration 2144/3560 Training loss: 1.2187 6.1159 sec/batch
Epoch 13/20  Iteration 2145/3560 Training loss: 1.2183 6.0711 sec/batch
Epoch 13/20  Iteration 2146/3560 Training loss: 1.2172 6.0226 sec/batch
Epoch 13/20  Iteration 2147/3560 Training loss: 1.2141 6.0568 sec/batch
Epoch 13/20  Iteration 2148/3560 Training loss: 1.2136 6.6158 sec/batch
Epoch 13/20  Iteration 2149/3560 Training loss: 1.2137 6.4441 sec/batch
Epoch 13/20  Iteration 2150/3560 Training loss: 1.2140 5.9993 sec/batch
Epoch 13/20  Iteration 2151/3560 Training loss: 1.2123 6.1006 sec/batch
Epoch 13/20  Iteration 2152/3560 Training loss: 1.2099 6.0256 sec/batch
Epoch 13/20  Iteration 2153/3560 Training loss: 1.2103 6.1067 sec/batch
Epoch 13/20  Iteration 2154/3560 Training loss: 1.2109 6.3520 sec/batch
Epoch 13/20  Iteration 2155/3560 Training loss: 1.2107 6.2418 sec/batch
Epoch 13/20  Iteration 2156/3560 Training loss: 1.2118 6.3951 sec/batch
Epoch 13/20  Iteration 2157/3560 Training loss: 1.2109 6.0326 sec/batch
Epoch 13/20  Iteration 2158/3560 Training loss: 1.2111 6.0312 sec/batch
Epoch 13/20  Iteration 2159/3560 Training loss: 1.2103 6.0899 sec/batch
Epoch 13/20  Iteration 2160/3560 Training loss: 1.2102 6.0151 sec/batch
Epoch 13/20  Iteration 2161/3560 Training loss: 1.2098 6.1193 sec/batch
Epoch 13/20  Iteration 2162/3560 Training loss: 1.2081 6.1634 sec/batch
Epoch 13/20  Iteration 2163/3560 Training loss: 1.2069 6.3727 sec/batch
Epoch 13/20  Iteration 2164/3560 Training loss: 1.2078 6.0445 sec/batch
Epoch 13/20  Iteration 2165/3560 Training loss: 1.2077 6.0197 sec/batch
Epoch 13/20  Iteration 2166/3560 Training loss: 1.2079 6.0341 sec/batch
Epoch 13/20  Iteration 2167/3560 Training loss: 1.2073 6.0868 sec/batch
Epoch 13/20  Iteration 2168/3560 Training loss: 1.2066 6.2133 sec/batch
Epoch 13/20  Iteration 2169/3560 Training loss: 1.2067 6.1601 sec/batch
Epoch 13/20  Iteration 2170/3560 Training loss: 1.2066 6.2889 sec/batch
Epoch 13/20  Iteration 2171/3560 Training loss: 1.2063 6.0760 sec/batch
Epoch 13/20  Iteration 2172/3560 Training loss: 1.2061 6.0411 sec/batch
Epoch 13/20  Iteration 2173/3560 Training loss: 1.2054 6.0505 sec/batch
Epoch 13/20  Iteration 2174/3560 Training loss: 1.2044 6.3839 sec/batch
Epoch 13/20  Iteration 2175/3560 Training loss: 1.2034 6.3071 sec/batch
Epoch 13/20  Iteration 2176/3560 Training loss: 1.2031 6.2795 sec/batch
Epoch 13/20  Iteration 2177/3560 Training loss: 1.2024 6.0652 sec/batch
Epoch 13/20  Iteration 2178/3560 Training loss: 1.2031 6.0406 sec/batch
Epoch 13/20  Iteration 2179/3560 Training loss: 1.2029 6.0447 sec/batch
Epoch 13/20  Iteration 2180/3560 Training loss: 1.2023 6.0230 sec/batch
Epoch 13/20  Iteration 2181/3560 Training loss: 1.2022 6.0330 sec/batch
Epoch 13/20  Iteration 2182/3560 Training loss: 1.2014 6.3174 sec/batch
Epoch 13/20  Iteration 2183/3560 Training loss: 1.2011 6.3030 sec/batch
Epoch 13/20  Iteration 2184/3560 Training loss: 1.2006 6.0829 sec/batch
Epoch 13/20  Iteration 2185/3560 Training loss: 1.2005 6.0201 sec/batch
Epoch 13/20  Iteration 2186/3560 Training loss: 1.2006 6.0423 sec/batch
Epoch 13/20  Iteration 2187/3560 Training loss: 1.2000 5.9678 sec/batch
Epoch 13/20  Iteration 2188/3560 Training loss: 1.2005 6.0306 sec/batch
Epoch 13/20  Iteration 2189/3560 Training loss: 1.2005 6.4975 sec/batch
Epoch 13/20  Iteration 2190/3560 Training loss: 1.2007 6.4228 sec/batch
Epoch 13/20  Iteration 2191/3560 Training loss: 1.2005 6.0218 sec/batch
Epoch 13/20  Iteration 2192/3560 Training loss: 1.2007 6.0672 sec/batch
Epoch 13/20  Iteration 2193/3560 Training loss: 1.2010 6.3496 sec/batch
Epoch 13/20  Iteration 2194/3560 Training loss: 1.2007 6.1819 sec/batch
Epoch 13/20  Iteration 2195/3560 Training loss: 1.2001 6.1561 sec/batch
Epoch 13/20  Iteration 2196/3560 Training loss: 1.2006 6.4852 sec/batch
Epoch 13/20  Iteration 2197/3560 Training loss: 1.2007 6.7448 sec/batch
Epoch 13/20  Iteration 2198/3560 Training loss: 1.2015 6.0925 sec/batch
Epoch 13/20  Iteration 2199/3560 Training loss: 1.2018 6.0149 sec/batch
Epoch 13/20  Iteration 2200/3560 Training loss: 1.2019 6.0658 sec/batch
Epoch 13/20  Iteration 2201/3560 Training loss: 1.2017 6.0987 sec/batch
Epoch 13/20  Iteration 2202/3560 Training loss: 1.2017 8.2254 sec/batch
Epoch 13/20  Iteration 2203/3560 Training loss: 1.2019 7.9128 sec/batch
Epoch 13/20  Iteration 2204/3560 Training loss: 1.2017 6.3430 sec/batch
Epoch 13/20  Iteration 2205/3560 Training loss: 1.2019 6.0605 sec/batch
Epoch 13/20  Iteration 2206/3560 Training loss: 1.2017 6.0701 sec/batch
Epoch 13/20  Iteration 2207/3560 Training loss: 1.2021 6.0930 sec/batch
Epoch 13/20  Iteration 2208/3560 Training loss: 1.2023 6.1240 sec/batch
Epoch 13/20  Iteration 2209/3560 Training loss: 1.2028 6.2214 sec/batch
Epoch 13/20  Iteration 2210/3560 Training loss: 1.2024 6.3799 sec/batch
Epoch 13/20  Iteration 2211/3560 Training loss: 1.2025 6.0431 sec/batch
Epoch 13/20  Iteration 2212/3560 Training loss: 1.2024 6.3563 sec/batch
Epoch 13/20  Iteration 2213/3560 Training loss: 1.2023 6.1391 sec/batch
Epoch 13/20  Iteration 2214/3560 Training loss: 1.2022 6.1020 sec/batch
Epoch 13/20  Iteration 2215/3560 Training loss: 1.2017 6.6613 sec/batch
Epoch 13/20  Iteration 2216/3560 Training loss: 1.2016 6.9723 sec/batch
Epoch 13/20  Iteration 2217/3560 Training loss: 1.2012 6.3238 sec/batch
Epoch 13/20  Iteration 2218/3560 Training loss: 1.2012 6.0567 sec/batch
Epoch 13/20  Iteration 2219/3560 Training loss: 1.2008 6.8869 sec/batch
Epoch 13/20  Iteration 2220/3560 Training loss: 1.2008 6.1841 sec/batch
Epoch 13/20  Iteration 2221/3560 Training loss: 1.2006 6.1061 sec/batch
Epoch 13/20  Iteration 2222/3560 Training loss: 1.2004 6.4866 sec/batch
Epoch 13/20  Iteration 2223/3560 Training loss: 1.2003 6.4304 sec/batch
Epoch 13/20  Iteration 2224/3560 Training loss: 1.2000 6.2650 sec/batch
Epoch 13/20  Iteration 2225/3560 Training loss: 1.1997 6.1843 sec/batch
Epoch 13/20  Iteration 2226/3560 Training loss: 1.1997 6.1778 sec/batch
Epoch 13/20  Iteration 2227/3560 Training loss: 1.1995 6.1728 sec/batch
Epoch 13/20  Iteration 2228/3560 Training loss: 1.1994 6.7098 sec/batch
Epoch 13/20  Iteration 2229/3560 Training loss: 1.1990 7.1855 sec/batch
Epoch 13/20  Iteration 2230/3560 Training loss: 1.1986 6.5749 sec/batch
Epoch 13/20  Iteration 2231/3560 Training loss: 1.1983 6.6694 sec/batch
Epoch 13/20  Iteration 2232/3560 Training loss: 1.1983 6.7839 sec/batch
Epoch 13/20  Iteration 2233/3560 Training loss: 1.1984 6.2049 sec/batch
Epoch 13/20  Iteration 2234/3560 Training loss: 1.1979 6.2021 sec/batch
Epoch 13/20  Iteration 2235/3560 Training loss: 1.1976 6.3924 sec/batch
Epoch 13/20  Iteration 2236/3560 Training loss: 1.1973 6.2678 sec/batch
Epoch 13/20  Iteration 2237/3560 Training loss: 1.1972 6.3930 sec/batch
Epoch 13/20  Iteration 2238/3560 Training loss: 1.1971 6.7135 sec/batch
Epoch 13/20  Iteration 2239/3560 Training loss: 1.1970 6.3499 sec/batch
Epoch 13/20  Iteration 2240/3560 Training loss: 1.1969 6.4789 sec/batch
Epoch 13/20  Iteration 2241/3560 Training loss: 1.1968 6.4366 sec/batch
Epoch 13/20  Iteration 2242/3560 Training loss: 1.1968 6.3752 sec/batch
Epoch 13/20  Iteration 2243/3560 Training loss: 1.1967 7.1644 sec/batch
Epoch 13/20  Iteration 2244/3560 Training loss: 1.1967 6.4589 sec/batch
Epoch 13/20  Iteration 2245/3560 Training loss: 1.1964 6.2452 sec/batch
Epoch 13/20  Iteration 2246/3560 Training loss: 1.1965 6.3359 sec/batch
Epoch 13/20  Iteration 2247/3560 Training loss: 1.1964 6.2929 sec/batch
Epoch 13/20  Iteration 2248/3560 Training loss: 1.1963 7.5206 sec/batch
Epoch 13/20  Iteration 2249/3560 Training loss: 1.1962 8.1282 sec/batch
Epoch 13/20  Iteration 2250/3560 Training loss: 1.1961 9.3940 sec/batch
Epoch 13/20  Iteration 2251/3560 Training loss: 1.1958 7.3591 sec/batch
Epoch 13/20  Iteration 2252/3560 Training loss: 1.1955 6.2596 sec/batch
Epoch 13/20  Iteration 2253/3560 Training loss: 1.1956 6.3941 sec/batch
Epoch 13/20  Iteration 2254/3560 Training loss: 1.1955 6.8479 sec/batch
Epoch 13/20  Iteration 2255/3560 Training loss: 1.1955 6.8892 sec/batch
Epoch 13/20  Iteration 2256/3560 Training loss: 1.1954 6.5896 sec/batch
Epoch 13/20  Iteration 2257/3560 Training loss: 1.1953 6.5430 sec/batch
Epoch 13/20  Iteration 2258/3560 Training loss: 1.1949 6.5729 sec/batch
Epoch 13/20  Iteration 2259/3560 Training loss: 1.1946 6.3436 sec/batch
Epoch 13/20  Iteration 2260/3560 Training loss: 1.1945 6.4880 sec/batch
Epoch 13/20  Iteration 2261/3560 Training loss: 1.1944 6.6708 sec/batch
Epoch 13/20  Iteration 2262/3560 Training loss: 1.1940 7.0967 sec/batch
Epoch 13/20  Iteration 2263/3560 Training loss: 1.1940 6.3696 sec/batch
Epoch 13/20  Iteration 2264/3560 Training loss: 1.1939 6.8002 sec/batch
Epoch 13/20  Iteration 2265/3560 Training loss: 1.1937 6.2322 sec/batch
Epoch 13/20  Iteration 2266/3560 Training loss: 1.1934 6.1896 sec/batch
Epoch 13/20  Iteration 2267/3560 Training loss: 1.1930 6.6231 sec/batch
Epoch 13/20  Iteration 2268/3560 Training loss: 1.1928 6.5099 sec/batch
Epoch 13/20  Iteration 2269/3560 Training loss: 1.1929 6.2749 sec/batch
Epoch 13/20  Iteration 2270/3560 Training loss: 1.1929 6.0692 sec/batch
Epoch 13/20  Iteration 2271/3560 Training loss: 1.1929 6.1374 sec/batch
Epoch 13/20  Iteration 2272/3560 Training loss: 1.1929 6.1059 sec/batch
Epoch 13/20  Iteration 2273/3560 Training loss: 1.1930 6.1681 sec/batch
Epoch 13/20  Iteration 2274/3560 Training loss: 1.1931 6.2721 sec/batch
Epoch 13/20  Iteration 2275/3560 Training loss: 1.1931 6.5874 sec/batch
Epoch 13/20  Iteration 2276/3560 Training loss: 1.1931 6.1040 sec/batch
Epoch 13/20  Iteration 2277/3560 Training loss: 1.1934 6.0629 sec/batch
Epoch 13/20  Iteration 2278/3560 Training loss: 1.1935 6.1495 sec/batch
Epoch 13/20  Iteration 2279/3560 Training loss: 1.1933 6.1128 sec/batch
Epoch 13/20  Iteration 2280/3560 Training loss: 1.1935 6.1916 sec/batch
Epoch 13/20  Iteration 2281/3560 Training loss: 1.1933 6.1757 sec/batch
Epoch 13/20  Iteration 2282/3560 Training loss: 1.1934 6.5051 sec/batch
Epoch 13/20  Iteration 2283/3560 Training loss: 1.1935 6.1184 sec/batch
Epoch 13/20  Iteration 2284/3560 Training loss: 1.1937 6.1187 sec/batch
Epoch 13/20  Iteration 2285/3560 Training loss: 1.1938 6.1196 sec/batch
Epoch 13/20  Iteration 2286/3560 Training loss: 1.1937 6.8118 sec/batch
Epoch 13/20  Iteration 2287/3560 Training loss: 1.1935 6.4421 sec/batch
Epoch 13/20  Iteration 2288/3560 Training loss: 1.1933 6.4117 sec/batch
Epoch 13/20  Iteration 2289/3560 Training loss: 1.1933 6.8360 sec/batch
Epoch 13/20  Iteration 2290/3560 Training loss: 1.1933 6.1428 sec/batch
Epoch 13/20  Iteration 2291/3560 Training loss: 1.1933 6.1425 sec/batch
Epoch 13/20  Iteration 2292/3560 Training loss: 1.1932 6.1572 sec/batch
Epoch 13/20  Iteration 2293/3560 Training loss: 1.1932 6.1362 sec/batch
Epoch 13/20  Iteration 2294/3560 Training loss: 1.1931 6.3036 sec/batch
Epoch 13/20  Iteration 2295/3560 Training loss: 1.1929 6.2954 sec/batch
Epoch 13/20  Iteration 2296/3560 Training loss: 1.1929 6.3600 sec/batch
Epoch 13/20  Iteration 2297/3560 Training loss: 1.1930 6.1686 sec/batch
Epoch 13/20  Iteration 2298/3560 Training loss: 1.1930 6.0586 sec/batch
Epoch 13/20  Iteration 2299/3560 Training loss: 1.1929 6.1140 sec/batch
Epoch 13/20  Iteration 2300/3560 Training loss: 1.1929 6.0826 sec/batch
Epoch 13/20  Iteration 2301/3560 Training loss: 1.1929 6.3283 sec/batch
Epoch 13/20  Iteration 2302/3560 Training loss: 1.1929 6.4475 sec/batch
Epoch 13/20  Iteration 2303/3560 Training loss: 1.1930 6.1474 sec/batch
Epoch 13/20  Iteration 2304/3560 Training loss: 1.1934 6.1118 sec/batch
Epoch 13/20  Iteration 2305/3560 Training loss: 1.1934 6.4498 sec/batch
Epoch 13/20  Iteration 2306/3560 Training loss: 1.1935 6.1549 sec/batch
Epoch 13/20  Iteration 2307/3560 Training loss: 1.1934 6.2505 sec/batch
Epoch 13/20  Iteration 2308/3560 Training loss: 1.1933 6.8947 sec/batch
Epoch 13/20  Iteration 2309/3560 Training loss: 1.1935 6.6055 sec/batch
Epoch 13/20  Iteration 2310/3560 Training loss: 1.1935 6.1684 sec/batch
Epoch 13/20  Iteration 2311/3560 Training loss: 1.1935 6.0549 sec/batch
Epoch 13/20  Iteration 2312/3560 Training loss: 1.1933 6.1922 sec/batch
Epoch 13/20  Iteration 2313/3560 Training loss: 1.1932 6.1113 sec/batch
Epoch 13/20  Iteration 2314/3560 Training loss: 1.1934 6.2574 sec/batch
Epoch 14/20  Iteration 2315/3560 Training loss: 1.3031 6.6361 sec/batch
Epoch 14/20  Iteration 2316/3560 Training loss: 1.2563 6.7276 sec/batch
Epoch 14/20  Iteration 2317/3560 Training loss: 1.2367 6.5234 sec/batch
Epoch 14/20  Iteration 2318/3560 Training loss: 1.2317 6.2069 sec/batch
Epoch 14/20  Iteration 2319/3560 Training loss: 1.2182 6.0976 sec/batch
Epoch 14/20  Iteration 2320/3560 Training loss: 1.2069 6.1049 sec/batch
Epoch 14/20  Iteration 2321/3560 Training loss: 1.2069 6.3901 sec/batch
Epoch 14/20  Iteration 2322/3560 Training loss: 1.2048 6.4551 sec/batch
Epoch 14/20  Iteration 2323/3560 Training loss: 1.2040 6.1089 sec/batch
Epoch 14/20  Iteration 2324/3560 Training loss: 1.2029 6.5161 sec/batch
Epoch 14/20  Iteration 2325/3560 Training loss: 1.2008 6.4237 sec/batch
Epoch 14/20  Iteration 2326/3560 Training loss: 1.2002 6.1810 sec/batch
Epoch 14/20  Iteration 2327/3560 Training loss: 1.2002 6.2061 sec/batch
Epoch 14/20  Iteration 2328/3560 Training loss: 1.2005 6.2927 sec/batch
Epoch 14/20  Iteration 2329/3560 Training loss: 1.1987 6.4658 sec/batch
Epoch 14/20  Iteration 2330/3560 Training loss: 1.1965 6.0978 sec/batch
Epoch 14/20  Iteration 2331/3560 Training loss: 1.1966 6.1341 sec/batch
Epoch 14/20  Iteration 2332/3560 Training loss: 1.1980 6.1642 sec/batch
Epoch 14/20  Iteration 2333/3560 Training loss: 1.1972 6.1792 sec/batch
Epoch 14/20  Iteration 2334/3560 Training loss: 1.1984 6.3643 sec/batch
Epoch 14/20  Iteration 2335/3560 Training loss: 1.1976 6.2265 sec/batch
Epoch 14/20  Iteration 2336/3560 Training loss: 1.1979 6.3953 sec/batch
Epoch 14/20  Iteration 2337/3560 Training loss: 1.1970 6.7456 sec/batch
Epoch 14/20  Iteration 2338/3560 Training loss: 1.1973 6.1793 sec/batch
Epoch 14/20  Iteration 2339/3560 Training loss: 1.1968 6.1090 sec/batch
Epoch 14/20  Iteration 2340/3560 Training loss: 1.1949 6.1530 sec/batch
Epoch 14/20  Iteration 2341/3560 Training loss: 1.1937 6.3735 sec/batch
Epoch 14/20  Iteration 2342/3560 Training loss: 1.1941 6.4100 sec/batch
Epoch 14/20  Iteration 2343/3560 Training loss: 1.1936 6.4462 sec/batch
Epoch 14/20  Iteration 2344/3560 Training loss: 1.1937 6.2389 sec/batch
Epoch 14/20  Iteration 2345/3560 Training loss: 1.1931 6.1680 sec/batch
Epoch 14/20  Iteration 2346/3560 Training loss: 1.1921 6.1298 sec/batch
Epoch 14/20  Iteration 2347/3560 Training loss: 1.1921 6.1786 sec/batch
Epoch 14/20  Iteration 2348/3560 Training loss: 1.1920 6.3152 sec/batch
Epoch 14/20  Iteration 2349/3560 Training loss: 1.1914 6.4703 sec/batch
Epoch 14/20  Iteration 2350/3560 Training loss: 1.1914 6.1121 sec/batch
Epoch 14/20  Iteration 2351/3560 Training loss: 1.1907 6.1087 sec/batch
Epoch 14/20  Iteration 2352/3560 Training loss: 1.1894 6.1399 sec/batch
Epoch 14/20  Iteration 2353/3560 Training loss: 1.1884 6.1283 sec/batch
Epoch 14/20  Iteration 2354/3560 Training loss: 1.1881 6.4781 sec/batch
Epoch 14/20  Iteration 2355/3560 Training loss: 1.1874 6.2029 sec/batch
Epoch 14/20  Iteration 2356/3560 Training loss: 1.1883 6.4485 sec/batch
Epoch 14/20  Iteration 2357/3560 Training loss: 1.1882 6.1464 sec/batch
Epoch 14/20  Iteration 2358/3560 Training loss: 1.1873 6.1680 sec/batch
Epoch 14/20  Iteration 2359/3560 Training loss: 1.1875 6.1094 sec/batch
Epoch 14/20  Iteration 2360/3560 Training loss: 1.1869 6.1613 sec/batch
Epoch 14/20  Iteration 2361/3560 Training loss: 1.1865 6.3745 sec/batch
Epoch 14/20  Iteration 2362/3560 Training loss: 1.1858 6.7018 sec/batch
Epoch 14/20  Iteration 2363/3560 Training loss: 1.1856 6.2234 sec/batch
Epoch 14/20  Iteration 2364/3560 Training loss: 1.1857 6.1401 sec/batch
Epoch 14/20  Iteration 2365/3560 Training loss: 1.1852 6.0696 sec/batch
Epoch 14/20  Iteration 2366/3560 Training loss: 1.1857 6.1070 sec/batch
Epoch 14/20  Iteration 2367/3560 Training loss: 1.1858 6.1126 sec/batch
Epoch 14/20  Iteration 2368/3560 Training loss: 1.1859 6.3666 sec/batch
Epoch 14/20  Iteration 2369/3560 Training loss: 1.1857 6.4937 sec/batch
Epoch 14/20  Iteration 2370/3560 Training loss: 1.1857 6.0566 sec/batch
Epoch 14/20  Iteration 2371/3560 Training loss: 1.1860 6.2089 sec/batch
Epoch 14/20  Iteration 2372/3560 Training loss: 1.1856 6.1219 sec/batch
Epoch 14/20  Iteration 2373/3560 Training loss: 1.1851 6.1743 sec/batch
Epoch 14/20  Iteration 2374/3560 Training loss: 1.1856 6.2556 sec/batch
Epoch 14/20  Iteration 2375/3560 Training loss: 1.1855 6.2567 sec/batch
Epoch 14/20  Iteration 2376/3560 Training loss: 1.1861 6.4901 sec/batch
Epoch 14/20  Iteration 2377/3560 Training loss: 1.1864 6.0366 sec/batch
Epoch 14/20  Iteration 2378/3560 Training loss: 1.1866 6.1561 sec/batch
Epoch 14/20  Iteration 2379/3560 Training loss: 1.1867 6.1191 sec/batch
Epoch 14/20  Iteration 2380/3560 Training loss: 1.1867 6.1312 sec/batch
Epoch 14/20  Iteration 2381/3560 Training loss: 1.1869 6.6514 sec/batch
Epoch 14/20  Iteration 2382/3560 Training loss: 1.1868 6.4469 sec/batch
Epoch 14/20  Iteration 2383/3560 Training loss: 1.1869 6.4287 sec/batch
Epoch 14/20  Iteration 2384/3560 Training loss: 1.1867 6.2234 sec/batch
Epoch 14/20  Iteration 2385/3560 Training loss: 1.1870 6.4843 sec/batch
Epoch 14/20  Iteration 2386/3560 Training loss: 1.1872 6.1595 sec/batch
Epoch 14/20  Iteration 2387/3560 Training loss: 1.1876 6.1661 sec/batch
Epoch 14/20  Iteration 2388/3560 Training loss: 1.1873 6.4072 sec/batch
Epoch 14/20  Iteration 2389/3560 Training loss: 1.1872 6.4141 sec/batch
Epoch 14/20  Iteration 2390/3560 Training loss: 1.1873 6.2883 sec/batch
Epoch 14/20  Iteration 2391/3560 Training loss: 1.1873 6.1025 sec/batch
Epoch 14/20  Iteration 2392/3560 Training loss: 1.1871 6.1990 sec/batch
Epoch 14/20  Iteration 2393/3560 Training loss: 1.1865 6.0844 sec/batch
Epoch 14/20  Iteration 2394/3560 Training loss: 1.1865 6.1843 sec/batch
Epoch 14/20  Iteration 2395/3560 Training loss: 1.1861 6.2938 sec/batch
Epoch 14/20  Iteration 2396/3560 Training loss: 1.1861 6.4232 sec/batch
Epoch 14/20  Iteration 2397/3560 Training loss: 1.1856 6.0781 sec/batch
Epoch 14/20  Iteration 2398/3560 Training loss: 1.1855 6.1545 sec/batch
Epoch 14/20  Iteration 2399/3560 Training loss: 1.1852 6.2692 sec/batch
Epoch 14/20  Iteration 2400/3560 Training loss: 1.1851 6.3255 sec/batch
Epoch 14/20  Iteration 2401/3560 Training loss: 1.1849 6.3649 sec/batch
Epoch 14/20  Iteration 2402/3560 Training loss: 1.1846 6.3814 sec/batch
Epoch 14/20  Iteration 2403/3560 Training loss: 1.1843 6.5313 sec/batch
Epoch 14/20  Iteration 2404/3560 Training loss: 1.1844 6.1297 sec/batch
Epoch 14/20  Iteration 2405/3560 Training loss: 1.1841 6.0935 sec/batch
Epoch 14/20  Iteration 2406/3560 Training loss: 1.1841 6.0943 sec/batch
Epoch 14/20  Iteration 2407/3560 Training loss: 1.1837 6.1140 sec/batch
Epoch 14/20  Iteration 2408/3560 Training loss: 1.1833 6.3882 sec/batch
Epoch 14/20  Iteration 2409/3560 Training loss: 1.1831 6.3176 sec/batch
Epoch 14/20  Iteration 2410/3560 Training loss: 1.1832 6.2196 sec/batch
Epoch 14/20  Iteration 2411/3560 Training loss: 1.1832 6.1332 sec/batch
Epoch 14/20  Iteration 2412/3560 Training loss: 1.1829 6.1146 sec/batch
Epoch 14/20  Iteration 2413/3560 Training loss: 1.1826 6.0927 sec/batch
Epoch 14/20  Iteration 2414/3560 Training loss: 1.1823 6.0878 sec/batch
Epoch 14/20  Iteration 2415/3560 Training loss: 1.1823 6.3538 sec/batch
Epoch 14/20  Iteration 2416/3560 Training loss: 1.1822 6.5119 sec/batch
Epoch 14/20  Iteration 2417/3560 Training loss: 1.1821 6.1400 sec/batch
Epoch 14/20  Iteration 2418/3560 Training loss: 1.1820 6.0688 sec/batch
Epoch 14/20  Iteration 2419/3560 Training loss: 1.1818 6.1703 sec/batch
Epoch 14/20  Iteration 2420/3560 Training loss: 1.1818 6.4619 sec/batch
Epoch 14/20  Iteration 2421/3560 Training loss: 1.1818 6.2704 sec/batch
Epoch 14/20  Iteration 2422/3560 Training loss: 1.1818 6.3603 sec/batch
Epoch 14/20  Iteration 2423/3560 Training loss: 1.1816 6.4619 sec/batch
Epoch 14/20  Iteration 2424/3560 Training loss: 1.1817 6.1186 sec/batch
Epoch 14/20  Iteration 2425/3560 Training loss: 1.1815 6.1140 sec/batch
Epoch 14/20  Iteration 2426/3560 Training loss: 1.1814 6.1595 sec/batch
Epoch 14/20  Iteration 2427/3560 Training loss: 1.1814 6.1443 sec/batch
Epoch 14/20  Iteration 2428/3560 Training loss: 1.1813 6.3502 sec/batch
Epoch 14/20  Iteration 2429/3560 Training loss: 1.1810 6.2266 sec/batch
Epoch 14/20  Iteration 2430/3560 Training loss: 1.1808 6.3191 sec/batch
Epoch 14/20  Iteration 2431/3560 Training loss: 1.1809 6.1523 sec/batch
Epoch 14/20  Iteration 2432/3560 Training loss: 1.1808 6.1827 sec/batch
Epoch 14/20  Iteration 2433/3560 Training loss: 1.1807 6.3800 sec/batch
Epoch 14/20  Iteration 2434/3560 Training loss: 1.1807 6.2276 sec/batch
Epoch 14/20  Iteration 2435/3560 Training loss: 1.1806 6.4356 sec/batch
Epoch 14/20  Iteration 2436/3560 Training loss: 1.1803 6.4428 sec/batch
Epoch 14/20  Iteration 2437/3560 Training loss: 1.1799 6.1854 sec/batch
Epoch 14/20  Iteration 2438/3560 Training loss: 1.1799 6.2076 sec/batch
Epoch 14/20  Iteration 2439/3560 Training loss: 1.1797 6.4318 sec/batch
Epoch 14/20  Iteration 2440/3560 Training loss: 1.1794 6.1982 sec/batch
Epoch 14/20  Iteration 2441/3560 Training loss: 1.1794 6.2430 sec/batch
Epoch 14/20  Iteration 2442/3560 Training loss: 1.1794 6.2671 sec/batch
Epoch 14/20  Iteration 2443/3560 Training loss: 1.1793 6.4403 sec/batch
Epoch 14/20  Iteration 2444/3560 Training loss: 1.1790 6.0831 sec/batch
Epoch 14/20  Iteration 2445/3560 Training loss: 1.1787 6.0964 sec/batch
Epoch 14/20  Iteration 2446/3560 Training loss: 1.1785 6.1367 sec/batch
Epoch 14/20  Iteration 2447/3560 Training loss: 1.1786 6.1003 sec/batch
Epoch 14/20  Iteration 2448/3560 Training loss: 1.1786 6.2805 sec/batch
Epoch 14/20  Iteration 2449/3560 Training loss: 1.1786 6.2202 sec/batch
Epoch 14/20  Iteration 2450/3560 Training loss: 1.1786 6.4406 sec/batch
Epoch 14/20  Iteration 2451/3560 Training loss: 1.1788 6.1404 sec/batch
Epoch 14/20  Iteration 2452/3560 Training loss: 1.1789 6.1473 sec/batch
Epoch 14/20  Iteration 2453/3560 Training loss: 1.1789 6.0644 sec/batch
Epoch 14/20  Iteration 2454/3560 Training loss: 1.1788 6.0531 sec/batch
Epoch 14/20  Iteration 2455/3560 Training loss: 1.1791 6.3829 sec/batch
Epoch 14/20  Iteration 2456/3560 Training loss: 1.1792 6.2228 sec/batch
Epoch 14/20  Iteration 2457/3560 Training loss: 1.1791 6.2940 sec/batch
Epoch 14/20  Iteration 2458/3560 Training loss: 1.1793 6.5607 sec/batch
Epoch 14/20  Iteration 2459/3560 Training loss: 1.1792 6.1946 sec/batch
Epoch 14/20  Iteration 2460/3560 Training loss: 1.1793 6.1307 sec/batch
Epoch 14/20  Iteration 2461/3560 Training loss: 1.1793 6.1197 sec/batch
Epoch 14/20  Iteration 2462/3560 Training loss: 1.1795 6.3655 sec/batch
Epoch 14/20  Iteration 2463/3560 Training loss: 1.1797 6.4036 sec/batch
Epoch 14/20  Iteration 2464/3560 Training loss: 1.1796 6.1103 sec/batch
Epoch 14/20  Iteration 2465/3560 Training loss: 1.1793 6.1315 sec/batch
Epoch 14/20  Iteration 2466/3560 Training loss: 1.1791 6.1143 sec/batch
Epoch 14/20  Iteration 2467/3560 Training loss: 1.1791 6.1169 sec/batch
Epoch 14/20  Iteration 2468/3560 Training loss: 1.1790 6.1634 sec/batch
Epoch 14/20  Iteration 2469/3560 Training loss: 1.1790 6.2743 sec/batch
Epoch 14/20  Iteration 2470/3560 Training loss: 1.1789 6.5296 sec/batch
Epoch 14/20  Iteration 2471/3560 Training loss: 1.1789 6.0747 sec/batch
Epoch 14/20  Iteration 2472/3560 Training loss: 1.1788 6.1358 sec/batch
Epoch 14/20  Iteration 2473/3560 Training loss: 1.1786 6.1279 sec/batch
Epoch 14/20  Iteration 2474/3560 Training loss: 1.1787 6.0713 sec/batch
Epoch 14/20  Iteration 2475/3560 Training loss: 1.1788 6.2092 sec/batch
Epoch 14/20  Iteration 2476/3560 Training loss: 1.1788 6.2566 sec/batch
Epoch 14/20  Iteration 2477/3560 Training loss: 1.1787 6.6617 sec/batch
Epoch 14/20  Iteration 2478/3560 Training loss: 1.1787 6.2697 sec/batch
Epoch 14/20  Iteration 2479/3560 Training loss: 1.1787 6.1494 sec/batch
Epoch 14/20  Iteration 2480/3560 Training loss: 1.1786 6.1877 sec/batch
Epoch 14/20  Iteration 2481/3560 Training loss: 1.1788 6.4531 sec/batch
Epoch 14/20  Iteration 2482/3560 Training loss: 1.1791 6.3978 sec/batch
Epoch 14/20  Iteration 2483/3560 Training loss: 1.1792 6.2782 sec/batch
Epoch 14/20  Iteration 2484/3560 Training loss: 1.1792 6.2410 sec/batch
Epoch 14/20  Iteration 2485/3560 Training loss: 1.1791 6.1107 sec/batch
Epoch 14/20  Iteration 2486/3560 Training loss: 1.1790 6.0907 sec/batch
Epoch 14/20  Iteration 2487/3560 Training loss: 1.1791 6.0255 sec/batch
Epoch 14/20  Iteration 2488/3560 Training loss: 1.1791 6.1052 sec/batch
Epoch 14/20  Iteration 2489/3560 Training loss: 1.1791 6.3653 sec/batch
Epoch 14/20  Iteration 2490/3560 Training loss: 1.1790 6.4716 sec/batch
Epoch 14/20  Iteration 2491/3560 Training loss: 1.1789 6.1386 sec/batch
Epoch 14/20  Iteration 2492/3560 Training loss: 1.1790 6.0977 sec/batch
Epoch 15/20  Iteration 2493/3560 Training loss: 1.2938 6.1021 sec/batch
Epoch 15/20  Iteration 2494/3560 Training loss: 1.2423 6.1596 sec/batch
Epoch 15/20  Iteration 2495/3560 Training loss: 1.2212 6.2204 sec/batch
Epoch 15/20  Iteration 2496/3560 Training loss: 1.2137 6.5784 sec/batch
Epoch 15/20  Iteration 2497/3560 Training loss: 1.2027 6.6553 sec/batch
Epoch 15/20  Iteration 2498/3560 Training loss: 1.1916 6.4999 sec/batch
Epoch 15/20  Iteration 2499/3560 Training loss: 1.1903 6.1065 sec/batch
Epoch 15/20  Iteration 2500/3560 Training loss: 1.1881 6.1122 sec/batch
Epoch 15/20  Iteration 2501/3560 Training loss: 1.1868 6.1430 sec/batch
Epoch 15/20  Iteration 2502/3560 Training loss: 1.1854 6.2908 sec/batch
Epoch 15/20  Iteration 2503/3560 Training loss: 1.1822 6.3009 sec/batch
Epoch 15/20  Iteration 2504/3560 Training loss: 1.1819 6.3259 sec/batch
Epoch 15/20  Iteration 2505/3560 Training loss: 1.1819 6.1945 sec/batch
Epoch 15/20  Iteration 2506/3560 Training loss: 1.1821 6.0952 sec/batch
Epoch 15/20  Iteration 2507/3560 Training loss: 1.1804 6.1504 sec/batch
Epoch 15/20  Iteration 2508/3560 Training loss: 1.1780 6.1532 sec/batch
Epoch 15/20  Iteration 2509/3560 Training loss: 1.1782 6.3981 sec/batch
Epoch 15/20  Iteration 2510/3560 Training loss: 1.1792 6.3595 sec/batch
Epoch 15/20  Iteration 2511/3560 Training loss: 1.1784 6.1869 sec/batch
Epoch 15/20  Iteration 2512/3560 Training loss: 1.1792 6.1899 sec/batch
Epoch 15/20  Iteration 2513/3560 Training loss: 1.1786 6.2195 sec/batch
Epoch 15/20  Iteration 2514/3560 Training loss: 1.1791 6.0980 sec/batch
Epoch 15/20  Iteration 2515/3560 Training loss: 1.1784 6.2601 sec/batch
Epoch 15/20  Iteration 2516/3560 Training loss: 1.1788 6.6701 sec/batch
Epoch 15/20  Iteration 2517/3560 Training loss: 1.1783 6.6074 sec/batch
Epoch 15/20  Iteration 2518/3560 Training loss: 1.1767 6.0958 sec/batch
Epoch 15/20  Iteration 2519/3560 Training loss: 1.1756 6.1072 sec/batch
Epoch 15/20  Iteration 2520/3560 Training loss: 1.1762 6.1459 sec/batch
Epoch 15/20  Iteration 2521/3560 Training loss: 1.1762 6.1426 sec/batch
Epoch 15/20  Iteration 2522/3560 Training loss: 1.1762 6.3009 sec/batch
Epoch 15/20  Iteration 2523/3560 Training loss: 1.1755 6.2748 sec/batch
Epoch 15/20  Iteration 2524/3560 Training loss: 1.1749 6.4802 sec/batch
Epoch 15/20  Iteration 2525/3560 Training loss: 1.1747 6.1471 sec/batch
Epoch 15/20  Iteration 2526/3560 Training loss: 1.1748 6.1080 sec/batch
Epoch 15/20  Iteration 2527/3560 Training loss: 1.1745 6.1159 sec/batch
Epoch 15/20  Iteration 2528/3560 Training loss: 1.1744 6.2543 sec/batch
Epoch 15/20  Iteration 2529/3560 Training loss: 1.1735 7.0261 sec/batch
Epoch 15/20  Iteration 2530/3560 Training loss: 1.1725 6.4759 sec/batch
Epoch 15/20  Iteration 2531/3560 Training loss: 1.1714 6.1738 sec/batch
Epoch 15/20  Iteration 2532/3560 Training loss: 1.1712 6.1953 sec/batch
Epoch 15/20  Iteration 2533/3560 Training loss: 1.1706 6.1333 sec/batch
Epoch 15/20  Iteration 2534/3560 Training loss: 1.1717 6.1117 sec/batch
Epoch 15/20  Iteration 2535/3560 Training loss: 1.1716 6.6375 sec/batch
Epoch 15/20  Iteration 2536/3560 Training loss: 1.1708 6.4969 sec/batch
Epoch 15/20  Iteration 2537/3560 Training loss: 1.1710 6.9822 sec/batch
Epoch 15/20  Iteration 2538/3560 Training loss: 1.1704 7.2367 sec/batch
Epoch 15/20  Iteration 2539/3560 Training loss: 1.1702 7.3631 sec/batch
Epoch 15/20  Iteration 2540/3560 Training loss: 1.1696 6.9913 sec/batch
Epoch 15/20  Iteration 2541/3560 Training loss: 1.1695 8.7379 sec/batch
Epoch 15/20  Iteration 2542/3560 Training loss: 1.1696 8.0757 sec/batch
Epoch 15/20  Iteration 2543/3560 Training loss: 1.1691 7.1791 sec/batch
Epoch 15/20  Iteration 2544/3560 Training loss: 1.1697 6.6198 sec/batch
Epoch 15/20  Iteration 2545/3560 Training loss: 1.1695 6.4479 sec/batch
Epoch 15/20  Iteration 2546/3560 Training loss: 1.1698 6.3171 sec/batch
Epoch 15/20  Iteration 2547/3560 Training loss: 1.1697 6.2927 sec/batch
Epoch 15/20  Iteration 2548/3560 Training loss: 1.1697 6.5638 sec/batch
Epoch 15/20  Iteration 2549/3560 Training loss: 1.1701 6.9373 sec/batch
Epoch 15/20  Iteration 2550/3560 Training loss: 1.1698 6.9489 sec/batch
Epoch 15/20  Iteration 2551/3560 Training loss: 1.1693 6.4421 sec/batch
Epoch 15/20  Iteration 2552/3560 Training loss: 1.1699 6.6391 sec/batch
Epoch 15/20  Iteration 2553/3560 Training loss: 1.1699 6.7288 sec/batch
Epoch 15/20  Iteration 2554/3560 Training loss: 1.1708 6.3149 sec/batch
Epoch 15/20  Iteration 2555/3560 Training loss: 1.1709 6.5086 sec/batch
Epoch 15/20  Iteration 2556/3560 Training loss: 1.1711 7.0827 sec/batch
Epoch 15/20  Iteration 2557/3560 Training loss: 1.1712 6.9204 sec/batch
Epoch 15/20  Iteration 2558/3560 Training loss: 1.1713 6.4180 sec/batch
Epoch 15/20  Iteration 2559/3560 Training loss: 1.1715 6.1707 sec/batch
Epoch 15/20  Iteration 2560/3560 Training loss: 1.1713 6.5606 sec/batch
Epoch 15/20  Iteration 2561/3560 Training loss: 1.1715 7.4889 sec/batch
Epoch 15/20  Iteration 2562/3560 Training loss: 1.1714 11.8855 sec/batch
Epoch 15/20  Iteration 2670/3560 Training loss: 1.1649 6.3029 sec/batch
Epoch 16/20  Iteration 2671/3560 Training loss: 1.2937 6.2797 sec/batch
Epoch 16/20  Iteration 2848/3560 Training loss: 1.1517 9.1423 sec/batch
Epoch 17/20  Iteration 2849/3560 Training loss: 1.2757 9.4220 sec/batch
Epoch 17/20  Iteration 3026/3560 Training loss: 1.1416 6.3888 sec/batch
Epoch 18/20  Iteration 3027/3560 Training loss: 1.2633 7.7363 sec/batch
Epoch 18/20  Iteration 3204/3560 Training loss: 1.1301 6.6370 sec/batch
Epoch 19/20  Iteration 3205/3560 Training loss: 1.2554 6.4719 sec/batch
Epoch 19/20  Iteration 3382/3560 Training loss: 1.1211 6.3523 sec/batch
Epoch 20/20  Iteration 3383/3560 Training loss: 1.2405 6.0815 sec/batch
Epoch 20/20  Iteration 3452/3560 Training loss: 1.1161 6.0311 sec/batch
Epoch 20/20  Iteration 3453/3560 Training loss: 1.1167 5.9770 sec/batch
Epoch 20/20  Iteration 3454/3560 Training loss: 1.1169 5.9977 sec/batch
Epoch 20/20  Iteration 3455/3560 Training loss: 1.1172 6.0181 sec/batch
Epoch 20/20  Iteration 3456/3560 Training loss: 1.1169 6.0016 sec/batch
Epoch 20/20  Iteration 3457/3560 Training loss: 1.1169 6.0341 sec/batch
Epoch 20/20  Iteration 3458/3560 Training loss: 1.1170 5.9851 sec/batch
Epoch 20/20  Iteration 3459/3560 Training loss: 1.1170 5.9791 sec/batch
Epoch 20/20  Iteration 3460/3560 Training loss: 1.1169 6.0335 sec/batch
Epoch 20/20  Iteration 3461/3560 Training loss: 1.1165 5.9802 sec/batch
Epoch 20/20  Iteration 3462/3560 Training loss: 1.1165 6.3998 sec/batch
Epoch 20/20  Iteration 3463/3560 Training loss: 1.1161 6.0396 sec/batch
Epoch 20/20  Iteration 3464/3560 Training loss: 1.1161 5.9949 sec/batch
Epoch 20/20  Iteration 3465/3560 Training loss: 1.1159 6.0370 sec/batch
Epoch 20/20  Iteration 3466/3560 Training loss: 1.1159 6.0153 sec/batch
Epoch 20/20  Iteration 3467/3560 Training loss: 1.1157 6.0279 sec/batch
Epoch 20/20  Iteration 3468/3560 Training loss: 1.1155 5.9986 sec/batch
Epoch 20/20  Iteration 3469/3560 Training loss: 1.1153 5.9822 sec/batch
Epoch 20/20  Iteration 3470/3560 Training loss: 1.1152 6.0085 sec/batch
Epoch 20/20  Iteration 3471/3560 Training loss: 1.1149 5.9991 sec/batch
Epoch 20/20  Iteration 3472/3560 Training loss: 1.1150 5.9775 sec/batch
Epoch 20/20  Iteration 3473/3560 Training loss: 1.1148 6.0273 sec/batch
Epoch 20/20  Iteration 3474/3560 Training loss: 1.1148 5.9898 sec/batch
Epoch 20/20  Iteration 3475/3560 Training loss: 1.1145 6.2508 sec/batch
Epoch 20/20  Iteration 3476/3560 Training loss: 1.1142 6.0115 sec/batch
Epoch 20/20  Iteration 3477/3560 Training loss: 1.1140 6.0024 sec/batch
Epoch 20/20  Iteration 3478/3560 Training loss: 1.1141 5.9790 sec/batch
Epoch 20/20  Iteration 3479/3560 Training loss: 1.1141 6.0197 sec/batch
Epoch 20/20  Iteration 3480/3560 Training loss: 1.1139 5.9950 sec/batch
Epoch 20/20  Iteration 3481/3560 Training loss: 1.1136 6.0517 sec/batch
Epoch 20/20  Iteration 3482/3560 Training loss: 1.1134 6.3030 sec/batch
Epoch 20/20  Iteration 3483/3560 Training loss: 1.1134 6.0001 sec/batch
Epoch 20/20  Iteration 3484/3560 Training loss: 1.1133 6.0376 sec/batch
Epoch 20/20  Iteration 3485/3560 Training loss: 1.1132 6.0007 sec/batch
Epoch 20/20  Iteration 3486/3560 Training loss: 1.1130 5.9852 sec/batch
Epoch 20/20  Iteration 3487/3560 Training loss: 1.1129 5.9868 sec/batch
Epoch 20/20  Iteration 3488/3560 Training loss: 1.1128 6.0061 sec/batch
Epoch 20/20  Iteration 3489/3560 Training loss: 1.1127 6.0058 sec/batch
Epoch 20/20  Iteration 3490/3560 Training loss: 1.1127 5.9902 sec/batch
Epoch 20/20  Iteration 3491/3560 Training loss: 1.1126 6.0233 sec/batch
Epoch 20/20  Iteration 3492/3560 Training loss: 1.1126 6.0276 sec/batch
Epoch 20/20  Iteration 3493/3560 Training loss: 1.1124 5.9998 sec/batch
Epoch 20/20  Iteration 3494/3560 Training loss: 1.1123 5.9862 sec/batch
Epoch 20/20  Iteration 3495/3560 Training loss: 1.1122 5.9728 sec/batch
Epoch 20/20  Iteration 3496/3560 Training loss: 1.1121 5.9906 sec/batch
Epoch 20/20  Iteration 3497/3560 Training loss: 1.1118 5.9605 sec/batch
Epoch 20/20  Iteration 3498/3560 Training loss: 1.1116 6.0216 sec/batch
Epoch 20/20  Iteration 3499/3560 Training loss: 1.1116 6.0827 sec/batch
Epoch 20/20  Iteration 3500/3560 Training loss: 1.1116 5.9977 sec/batch
Epoch 20/20  Iteration 3501/3560 Training loss: 1.1115 6.0363 sec/batch
Epoch 20/20  Iteration 3502/3560 Training loss: 1.1115 6.3791 sec/batch
Epoch 20/20  Iteration 3503/3560 Training loss: 1.1114 6.0195 sec/batch
Epoch 20/20  Iteration 3504/3560 Training loss: 1.1111 5.9952 sec/batch
Epoch 20/20  Iteration 3505/3560 Training loss: 1.1108 5.9856 sec/batch
Epoch 20/20  Iteration 3506/3560 Training loss: 1.1107 6.1896 sec/batch
Epoch 20/20  Iteration 3507/3560 Training loss: 1.1106 6.0151 sec/batch
Epoch 20/20  Iteration 3508/3560 Training loss: 1.1103 5.9513 sec/batch
Epoch 20/20  Iteration 3509/3560 Training loss: 1.1103 6.0165 sec/batch
Epoch 20/20  Iteration 3510/3560 Training loss: 1.1103 5.9940 sec/batch
Epoch 20/20  Iteration 3511/3560 Training loss: 1.1101 5.9845 sec/batch
Epoch 20/20  Iteration 3512/3560 Training loss: 1.1098 5.9907 sec/batch
Epoch 20/20  Iteration 3513/3560 Training loss: 1.1094 5.9340 sec/batch
Epoch 20/20  Iteration 3514/3560 Training loss: 1.1093 6.0128 sec/batch
Epoch 20/20  Iteration 3515/3560 Training loss: 1.1094 5.9481 sec/batch
Epoch 20/20  Iteration 3516/3560 Training loss: 1.1095 5.9270 sec/batch
Epoch 20/20  Iteration 3517/3560 Training loss: 1.1095 5.9746 sec/batch
Epoch 20/20  Iteration 3518/3560 Training loss: 1.1095 5.9734 sec/batch
Epoch 20/20  Iteration 3519/3560 Training loss: 1.1097 5.9677 sec/batch
Epoch 20/20  Iteration 3520/3560 Training loss: 1.1098 5.9714 sec/batch
Epoch 20/20  Iteration 3521/3560 Training loss: 1.1098 6.0002 sec/batch
Epoch 20/20  Iteration 3522/3560 Training loss: 1.1099 6.2487 sec/batch
Epoch 20/20  Iteration 3523/3560 Training loss: 1.1103 5.9793 sec/batch
Epoch 20/20  Iteration 3524/3560 Training loss: 1.1105 6.0000 sec/batch
Epoch 20/20  Iteration 3525/3560 Training loss: 1.1105 6.3928 sec/batch
Epoch 20/20  Iteration 3526/3560 Training loss: 1.1106 5.9923 sec/batch
Epoch 20/20  Iteration 3527/3560 Training loss: 1.1105 5.9246 sec/batch
Epoch 20/20  Iteration 3528/3560 Training loss: 1.1106 6.0521 sec/batch
Epoch 20/20  Iteration 3529/3560 Training loss: 1.1107 5.9884 sec/batch
Epoch 20/20  Iteration 3530/3560 Training loss: 1.1110 6.0052 sec/batch
Epoch 20/20  Iteration 3531/3560 Training loss: 1.1112 5.9959 sec/batch
Epoch 20/20  Iteration 3532/3560 Training loss: 1.1112 5.9588 sec/batch
Epoch 20/20  Iteration 3533/3560 Training loss: 1.1109 5.8899 sec/batch
Epoch 20/20  Iteration 3534/3560 Training loss: 1.1108 6.0132 sec/batch
Epoch 20/20  Iteration 3535/3560 Training loss: 1.1109 6.0113 sec/batch
Epoch 20/20  Iteration 3536/3560 Training loss: 1.1108 6.0016 sec/batch
Epoch 20/20  Iteration 3537/3560 Training loss: 1.1108 5.9364 sec/batch
Epoch 20/20  Iteration 3538/3560 Training loss: 1.1108 5.9560 sec/batch
Epoch 20/20  Iteration 3539/3560 Training loss: 1.1108 5.9642 sec/batch
Epoch 20/20  Iteration 3540/3560 Training loss: 1.1107 5.9861 sec/batch
Epoch 20/20  Iteration 3541/3560 Training loss: 1.1105 5.9946 sec/batch
Epoch 20/20  Iteration 3542/3560 Training loss: 1.1107 6.2642 sec/batch
Epoch 20/20  Iteration 3543/3560 Training loss: 1.1109 5.9440 sec/batch
Epoch 20/20  Iteration 3544/3560 Training loss: 1.1109 6.0408 sec/batch
Epoch 20/20  Iteration 3545/3560 Training loss: 1.1108 5.9287 sec/batch
Epoch 20/20  Iteration 3546/3560 Training loss: 1.1108 5.9797 sec/batch
Epoch 20/20  Iteration 3547/3560 Training loss: 1.1109 6.1923 sec/batch
Epoch 20/20  Iteration 3548/3560 Training loss: 1.1108 5.9922 sec/batch
Epoch 20/20  Iteration 3549/3560 Training loss: 1.1110 5.9720 sec/batch
Epoch 20/20  Iteration 3550/3560 Training loss: 1.1113 6.0493 sec/batch
Epoch 20/20  Iteration 3551/3560 Training loss: 1.1114 6.0049 sec/batch
Epoch 20/20  Iteration 3552/3560 Training loss: 1.1115 5.9841 sec/batch
Epoch 20/20  Iteration 3553/3560 Training loss: 1.1115 5.9540 sec/batch
Epoch 20/20  Iteration 3554/3560 Training loss: 1.1115 6.0029 sec/batch
Epoch 20/20  Iteration 3555/3560 Training loss: 1.1116 5.9780 sec/batch
Epoch 20/20  Iteration 3556/3560 Training loss: 1.1117 5.9544 sec/batch
Epoch 20/20  Iteration 3557/3560 Training loss: 1.1118 5.9821 sec/batch
Epoch 20/20  Iteration 3558/3560 Training loss: 1.1116 6.0085 sec/batch
Epoch 20/20  Iteration 3559/3560 Training loss: 1.1116 5.9986 sec/batch
Epoch 20/20  Iteration 3560/3560 Training loss: 1.1118 5.9997 sec/batch
Epoch 1/20  Iteration 1/3560 Training loss: 4.4191 6.0069 sec/batch
... (iterations 2-177 elided; running training loss declines toward ~3.43, with a transient rise near iteration 90)
Epoch 1/20  Iteration 178/3560 Training loss: 3.4269 5.9834 sec/batch
Epoch 2/20  Iteration 179/3560 Training loss: 2.7752 6.0096 sec/batch
... (iterations 180-355 elided; training loss falls to ~2.39)
Epoch 2/20  Iteration 356/3560 Training loss: 2.3879 5.9902 sec/batch
Epoch 3/20  Iteration 357/3560 Training loss: 2.2982 5.9724 sec/batch
... (iterations 358-533 elided; training loss falls to ~2.07)
Epoch 3/20  Iteration 534/3560 Training loss: 2.0742 5.9100 sec/batch
Epoch 4/20  Iteration 535/3560 Training loss: 2.0505 5.9733 sec/batch
... (iterations 536-702 elided; training loss falls to ~1.89)
Epoch 4/20  Iteration 703/3560 Training loss: 1.8908 5.9561 sec/batch
Epoch 4/20  Iteration 704/3560 Training loss: 1.8905 5.9625 sec/batch
Epoch 4/20  Iteration 705/3560 Training loss: 1.8900 5.9925 sec/batch
Epoch 4/20  Iteration 706/3560 Training loss: 1.8895 6.0079 sec/batch
Epoch 4/20  Iteration 707/3560 Training loss: 1.8892 5.9418 sec/batch
Epoch 4/20  Iteration 708/3560 Training loss: 1.8888 5.9582 sec/batch
Epoch 4/20  Iteration 709/3560 Training loss: 1.8885 6.0073 sec/batch
Epoch 4/20  Iteration 710/3560 Training loss: 1.8882 5.9832 sec/batch
Epoch 4/20  Iteration 711/3560 Training loss: 1.8877 6.0094 sec/batch
Epoch 4/20  Iteration 712/3560 Training loss: 1.8874 5.9413 sec/batch
Epoch 5/20  Iteration 713/3560 Training loss: 1.9141 6.4283 sec/batch
Epoch 5/20  Iteration 714/3560 Training loss: 1.8630 6.0775 sec/batch
Epoch 5/20  Iteration 715/3560 Training loss: 1.8463 6.0099 sec/batch
Epoch 5/20  Iteration 716/3560 Training loss: 1.8366 5.9842 sec/batch
Epoch 5/20  Iteration 717/3560 Training loss: 1.8324 6.0201 sec/batch
Epoch 5/20  Iteration 718/3560 Training loss: 1.8211 6.0309 sec/batch
Epoch 5/20  Iteration 719/3560 Training loss: 1.8215 5.9683 sec/batch
Epoch 5/20  Iteration 720/3560 Training loss: 1.8187 6.3202 sec/batch
Epoch 5/20  Iteration 721/3560 Training loss: 1.8213 5.9673 sec/batch
Epoch 5/20  Iteration 722/3560 Training loss: 1.8205 6.1319 sec/batch
Epoch 5/20  Iteration 723/3560 Training loss: 1.8167 5.9576 sec/batch
Epoch 5/20  Iteration 724/3560 Training loss: 1.8150 6.0279 sec/batch
Epoch 5/20  Iteration 725/3560 Training loss: 1.8156 6.1076 sec/batch
Epoch 5/20  Iteration 726/3560 Training loss: 1.8172 6.0066 sec/batch
Epoch 5/20  Iteration 727/3560 Training loss: 1.8158 6.0492 sec/batch
Epoch 5/20  Iteration 728/3560 Training loss: 1.8137 6.0052 sec/batch
Epoch 5/20  Iteration 729/3560 Training loss: 1.8131 5.9836 sec/batch
Epoch 5/20  Iteration 730/3560 Training loss: 1.8150 6.1320 sec/batch
Epoch 5/20  Iteration 731/3560 Training loss: 1.8147 6.0033 sec/batch
Epoch 5/20  Iteration 732/3560 Training loss: 1.8144 6.0116 sec/batch
Epoch 5/20  Iteration 733/3560 Training loss: 1.8131 5.9774 sec/batch
Epoch 5/20  Iteration 734/3560 Training loss: 1.8144 5.9409 sec/batch
Epoch 5/20  Iteration 735/3560 Training loss: 1.8134 5.9972 sec/batch
Epoch 5/20  Iteration 736/3560 Training loss: 1.8126 5.9682 sec/batch
Epoch 5/20  Iteration 737/3560 Training loss: 1.8121 5.9976 sec/batch
Epoch 5/20  Iteration 738/3560 Training loss: 1.8102 5.9721 sec/batch
Epoch 5/20  Iteration 739/3560 Training loss: 1.8093 5.9704 sec/batch
Epoch 5/20  Iteration 740/3560 Training loss: 1.8091 6.2340 sec/batch
Epoch 5/20  Iteration 741/3560 Training loss: 1.8097 5.9742 sec/batch
Epoch 5/20  Iteration 742/3560 Training loss: 1.8095 6.0156 sec/batch
Epoch 5/20  Iteration 743/3560 Training loss: 1.8093 5.9540 sec/batch
Epoch 5/20  Iteration 744/3560 Training loss: 1.8083 5.9720 sec/batch
Epoch 5/20  Iteration 745/3560 Training loss: 1.8078 5.9524 sec/batch
Epoch 5/20  Iteration 746/3560 Training loss: 1.8083 5.9894 sec/batch
Epoch 5/20  Iteration 747/3560 Training loss: 1.8077 5.9810 sec/batch
Epoch 5/20  Iteration 748/3560 Training loss: 1.8071 5.9934 sec/batch
Epoch 5/20  Iteration 749/3560 Training loss: 1.8063 5.9589 sec/batch
Epoch 5/20  Iteration 750/3560 Training loss: 1.8048 5.9644 sec/batch
Epoch 5/20  Iteration 751/3560 Training loss: 1.8032 6.0083 sec/batch
Epoch 5/20  Iteration 752/3560 Training loss: 1.8023 6.0019 sec/batch
Epoch 5/20  Iteration 753/3560 Training loss: 1.8015 6.0006 sec/batch
Epoch 5/20  Iteration 754/3560 Training loss: 1.8017 5.9801 sec/batch
Epoch 5/20  Iteration 755/3560 Training loss: 1.8011 5.9466 sec/batch
Epoch 5/20  Iteration 756/3560 Training loss: 1.8001 5.9441 sec/batch
Epoch 5/20  Iteration 757/3560 Training loss: 1.8000 6.0233 sec/batch
Epoch 5/20  Iteration 758/3560 Training loss: 1.7987 5.9620 sec/batch
Epoch 5/20  Iteration 759/3560 Training loss: 1.7983 6.0140 sec/batch
Epoch 5/20  Iteration 760/3560 Training loss: 1.7975 6.2560 sec/batch
Epoch 5/20  Iteration 761/3560 Training loss: 1.7969 5.9742 sec/batch
Epoch 5/20  Iteration 762/3560 Training loss: 1.7974 5.9815 sec/batch
Epoch 5/20  Iteration 763/3560 Training loss: 1.7966 6.2413 sec/batch
Epoch 5/20  Iteration 764/3560 Training loss: 1.7972 6.0595 sec/batch
Epoch 5/20  Iteration 765/3560 Training loss: 1.7965 6.0199 sec/batch
Epoch 5/20  Iteration 766/3560 Training loss: 1.7962 6.0041 sec/batch
Epoch 5/20  Iteration 767/3560 Training loss: 1.7954 5.9635 sec/batch
Epoch 5/20  Iteration 768/3560 Training loss: 1.7951 5.9519 sec/batch
Epoch 5/20  Iteration 769/3560 Training loss: 1.7949 5.9637 sec/batch
Epoch 5/20  Iteration 770/3560 Training loss: 1.7943 5.9658 sec/batch
Epoch 5/20  Iteration 771/3560 Training loss: 1.7935 5.9705 sec/batch
Epoch 5/20  Iteration 772/3560 Training loss: 1.7938 5.8996 sec/batch
Epoch 5/20  Iteration 773/3560 Training loss: 1.7934 5.9687 sec/batch
Epoch 5/20  Iteration 774/3560 Training loss: 1.7939 5.9499 sec/batch
Epoch 5/20  Iteration 775/3560 Training loss: 1.7941 5.9569 sec/batch
Epoch 5/20  Iteration 776/3560 Training loss: 1.7943 5.9926 sec/batch
Epoch 5/20  Iteration 777/3560 Training loss: 1.7939 6.0376 sec/batch
Epoch 5/20  Iteration 778/3560 Training loss: 1.7942 5.9503 sec/batch
Epoch 5/20  Iteration 779/3560 Training loss: 1.7941 5.9533 sec/batch
Epoch 5/20  Iteration 780/3560 Training loss: 1.7934 6.2125 sec/batch
Epoch 5/20  Iteration 781/3560 Training loss: 1.7930 5.9653 sec/batch
Epoch 5/20  Iteration 782/3560 Training loss: 1.7926 6.0055 sec/batch
Epoch 5/20  Iteration 783/3560 Training loss: 1.7929 5.9274 sec/batch
Epoch 5/20  Iteration 784/3560 Training loss: 1.7926 5.9374 sec/batch
Epoch 5/20  Iteration 785/3560 Training loss: 1.7926 5.9720 sec/batch
Epoch 5/20  Iteration 786/3560 Training loss: 1.7919 5.9386 sec/batch
Epoch 5/20  Iteration 787/3560 Training loss: 1.7915 6.0052 sec/batch
Epoch 5/20  Iteration 788/3560 Training loss: 1.7915 5.9405 sec/batch
Epoch 5/20  Iteration 789/3560 Training loss: 1.7911 6.0250 sec/batch
Epoch 5/20  Iteration 790/3560 Training loss: 1.7909 5.9719 sec/batch
Epoch 5/20  Iteration 791/3560 Training loss: 1.7901 5.9781 sec/batch
Epoch 5/20  Iteration 792/3560 Training loss: 1.7897 5.9689 sec/batch
Epoch 5/20  Iteration 793/3560 Training loss: 1.7889 5.9616 sec/batch
Epoch 5/20  Iteration 794/3560 Training loss: 1.7888 6.0028 sec/batch
Epoch 5/20  Iteration 795/3560 Training loss: 1.7881 5.9771 sec/batch
Epoch 5/20  Iteration 796/3560 Training loss: 1.7878 5.9714 sec/batch
Epoch 5/20  Iteration 797/3560 Training loss: 1.7871 5.9376 sec/batch
Epoch 5/20  Iteration 798/3560 Training loss: 1.7865 5.9549 sec/batch
Epoch 5/20  Iteration 799/3560 Training loss: 1.7860 5.9834 sec/batch
Epoch 5/20  Iteration 800/3560 Training loss: 1.7855 6.2311 sec/batch
Epoch 5/20  Iteration 801/3560 Training loss: 1.7847 5.9784 sec/batch
Epoch 5/20  Iteration 802/3560 Training loss: 1.7845 6.0315 sec/batch
Epoch 5/20  Iteration 803/3560 Training loss: 1.7838 5.9062 sec/batch
Epoch 5/20  Iteration 804/3560 Training loss: 1.7834 5.9927 sec/batch
Epoch 5/20  Iteration 805/3560 Training loss: 1.7827 5.9307 sec/batch
Epoch 5/20  Iteration 806/3560 Training loss: 1.7821 5.9789 sec/batch
Epoch 5/20  Iteration 807/3560 Training loss: 1.7815 5.9790 sec/batch
Epoch 5/20  Iteration 808/3560 Training loss: 1.7811 5.9768 sec/batch
Epoch 5/20  Iteration 809/3560 Training loss: 1.7807 6.0075 sec/batch
Epoch 5/20  Iteration 810/3560 Training loss: 1.7800 5.9664 sec/batch
Epoch 5/20  Iteration 811/3560 Training loss: 1.7794 5.9810 sec/batch
Epoch 5/20  Iteration 812/3560 Training loss: 1.7787 6.0127 sec/batch
Epoch 5/20  Iteration 813/3560 Training loss: 1.7785 6.3561 sec/batch
Epoch 5/20  Iteration 814/3560 Training loss: 1.7780 5.9797 sec/batch
Epoch 5/20  Iteration 815/3560 Training loss: 1.7776 5.9427 sec/batch
Epoch 5/20  Iteration 816/3560 Training loss: 1.7771 6.0121 sec/batch
Epoch 5/20  Iteration 817/3560 Training loss: 1.7766 5.9570 sec/batch
Epoch 5/20  Iteration 818/3560 Training loss: 1.7763 5.9632 sec/batch
Epoch 5/20  Iteration 819/3560 Training loss: 1.7760 6.0217 sec/batch
Epoch 5/20  Iteration 820/3560 Training loss: 1.7756 6.2000 sec/batch
Epoch 5/20  Iteration 821/3560 Training loss: 1.7753 5.9944 sec/batch
Epoch 5/20  Iteration 822/3560 Training loss: 1.7751 5.9842 sec/batch
Epoch 5/20  Iteration 823/3560 Training loss: 1.7747 5.9934 sec/batch
Epoch 5/20  Iteration 824/3560 Training loss: 1.7743 5.9661 sec/batch
Epoch 5/20  Iteration 825/3560 Training loss: 1.7738 5.9533 sec/batch
Epoch 5/20  Iteration 826/3560 Training loss: 1.7734 6.0278 sec/batch
Epoch 5/20  Iteration 827/3560 Training loss: 1.7729 6.0017 sec/batch
Epoch 5/20  Iteration 828/3560 Training loss: 1.7722 5.9789 sec/batch
Epoch 5/20  Iteration 829/3560 Training loss: 1.7719 5.9929 sec/batch
Epoch 5/20  Iteration 830/3560 Training loss: 1.7715 5.9793 sec/batch
Epoch 5/20  Iteration 831/3560 Training loss: 1.7711 5.9413 sec/batch
Epoch 5/20  Iteration 832/3560 Training loss: 1.7708 6.0183 sec/batch
Epoch 5/20  Iteration 833/3560 Training loss: 1.7704 5.9847 sec/batch
Epoch 5/20  Iteration 834/3560 Training loss: 1.7699 5.9365 sec/batch
Epoch 5/20  Iteration 835/3560 Training loss: 1.7693 5.9829 sec/batch
Epoch 5/20  Iteration 836/3560 Training loss: 1.7691 6.0194 sec/batch
Epoch 5/20  Iteration 837/3560 Training loss: 1.7687 5.9531 sec/batch
Epoch 5/20  Iteration 838/3560 Training loss: 1.7681 5.9937 sec/batch
Epoch 5/20  Iteration 839/3560 Training loss: 1.7680 5.9499 sec/batch
Epoch 5/20  Iteration 840/3560 Training loss: 1.7677 6.2578 sec/batch
Epoch 5/20  Iteration 841/3560 Training loss: 1.7674 5.9840 sec/batch
Epoch 5/20  Iteration 842/3560 Training loss: 1.7670 6.0036 sec/batch
Epoch 5/20  Iteration 843/3560 Training loss: 1.7664 5.9554 sec/batch
Epoch 5/20  Iteration 844/3560 Training loss: 1.7659 5.9789 sec/batch
Epoch 5/20  Iteration 845/3560 Training loss: 1.7657 5.9860 sec/batch
Epoch 5/20  Iteration 846/3560 Training loss: 1.7655 6.0002 sec/batch
Epoch 5/20  Iteration 847/3560 Training loss: 1.7652 5.9982 sec/batch
Epoch 5/20  Iteration 848/3560 Training loss: 1.7649 5.9578 sec/batch
Epoch 5/20  Iteration 849/3560 Training loss: 1.7648 5.9816 sec/batch
Epoch 5/20  Iteration 850/3560 Training loss: 1.7646 5.9650 sec/batch
Epoch 5/20  Iteration 851/3560 Training loss: 1.7645 5.9712 sec/batch
Epoch 5/20  Iteration 852/3560 Training loss: 1.7642 6.0020 sec/batch
Epoch 5/20  Iteration 853/3560 Training loss: 1.7642 5.9846 sec/batch
Epoch 5/20  Iteration 854/3560 Training loss: 1.7639 5.9697 sec/batch
Epoch 5/20  Iteration 855/3560 Training loss: 1.7636 5.9617 sec/batch
Epoch 5/20  Iteration 856/3560 Training loss: 1.7634 5.9979 sec/batch
Epoch 5/20  Iteration 857/3560 Training loss: 1.7631 5.9973 sec/batch
Epoch 5/20  Iteration 858/3560 Training loss: 1.7629 6.0005 sec/batch
Epoch 5/20  Iteration 859/3560 Training loss: 1.7627 5.9670 sec/batch
Epoch 5/20  Iteration 860/3560 Training loss: 1.7627 6.2365 sec/batch
Epoch 5/20  Iteration 861/3560 Training loss: 1.7625 5.9789 sec/batch
Epoch 5/20  Iteration 862/3560 Training loss: 1.7622 5.9910 sec/batch
Epoch 5/20  Iteration 863/3560 Training loss: 1.7617 6.2379 sec/batch
Epoch 5/20  Iteration 864/3560 Training loss: 1.7615 5.9497 sec/batch
Epoch 5/20  Iteration 865/3560 Training loss: 1.7614 5.9644 sec/batch
Epoch 5/20  Iteration 866/3560 Training loss: 1.7612 5.9850 sec/batch
Epoch 5/20  Iteration 867/3560 Training loss: 1.7609 6.0011 sec/batch
Epoch 5/20  Iteration 868/3560 Training loss: 1.7606 6.0437 sec/batch
Epoch 5/20  Iteration 869/3560 Training loss: 1.7604 6.0580 sec/batch
Epoch 5/20  Iteration 870/3560 Training loss: 1.7601 6.0053 sec/batch
Epoch 5/20  Iteration 871/3560 Training loss: 1.7597 5.9757 sec/batch
Epoch 5/20  Iteration 872/3560 Training loss: 1.7595 6.0335 sec/batch
Epoch 5/20  Iteration 873/3560 Training loss: 1.7594 5.9406 sec/batch
Epoch 5/20  Iteration 874/3560 Training loss: 1.7591 5.9970 sec/batch
Epoch 5/20  Iteration 875/3560 Training loss: 1.7590 5.9651 sec/batch
Epoch 5/20  Iteration 876/3560 Training loss: 1.7587 5.9495 sec/batch
Epoch 5/20  Iteration 877/3560 Training loss: 1.7583 5.9763 sec/batch
Epoch 5/20  Iteration 878/3560 Training loss: 1.7580 5.9590 sec/batch
Epoch 5/20  Iteration 879/3560 Training loss: 1.7579 5.9898 sec/batch
Epoch 5/20  Iteration 880/3560 Training loss: 1.7580 6.1952 sec/batch
Epoch 5/20  Iteration 881/3560 Training loss: 1.7576 6.0410 sec/batch
Epoch 5/20  Iteration 882/3560 Training loss: 1.7574 5.9954 sec/batch
Epoch 5/20  Iteration 883/3560 Training loss: 1.7571 5.9685 sec/batch
Epoch 5/20  Iteration 884/3560 Training loss: 1.7567 5.9418 sec/batch
Epoch 5/20  Iteration 885/3560 Training loss: 1.7565 5.9996 sec/batch
Epoch 5/20  Iteration 886/3560 Training loss: 1.7562 6.0027 sec/batch
Epoch 5/20  Iteration 887/3560 Training loss: 1.7561 5.9858 sec/batch
Epoch 5/20  Iteration 888/3560 Training loss: 1.7558 5.9634 sec/batch
Epoch 5/20  Iteration 889/3560 Training loss: 1.7554 5.9643 sec/batch
Epoch 5/20  Iteration 890/3560 Training loss: 1.7552 5.9463 sec/batch
Epoch 6/20  Iteration 891/3560 Training loss: 1.7913 5.9571 sec/batch
Epoch 6/20  Iteration 892/3560 Training loss: 1.7474 5.9668 sec/batch
Epoch 6/20  Iteration 893/3560 Training loss: 1.7337 5.9742 sec/batch
Epoch 6/20  Iteration 894/3560 Training loss: 1.7265 6.0131 sec/batch
Epoch 6/20  Iteration 895/3560 Training loss: 1.7193 5.9780 sec/batch
Epoch 6/20  Iteration 896/3560 Training loss: 1.7088 5.9536 sec/batch
Epoch 6/20  Iteration 897/3560 Training loss: 1.7084 5.9749 sec/batch
Epoch 6/20  Iteration 898/3560 Training loss: 1.7063 5.9474 sec/batch
Epoch 6/20  Iteration 899/3560 Training loss: 1.7075 5.9463 sec/batch
Epoch 6/20  Iteration 900/3560 Training loss: 1.7064 6.3273 sec/batch
Epoch 6/20  Iteration 901/3560 Training loss: 1.7022 5.9781 sec/batch
Epoch 6/20  Iteration 902/3560 Training loss: 1.6999 6.0080 sec/batch
Epoch 6/20  Iteration 903/3560 Training loss: 1.6994 5.9572 sec/batch
Epoch 6/20  Iteration 904/3560 Training loss: 1.7013 6.0184 sec/batch
Epoch 6/20  Iteration 905/3560 Training loss: 1.7001 5.9781 sec/batch
Epoch 6/20  Iteration 906/3560 Training loss: 1.6982 6.1790 sec/batch
Epoch 6/20  Iteration 907/3560 Training loss: 1.6981 5.9835 sec/batch
Epoch 6/20  Iteration 908/3560 Training loss: 1.6995 5.9960 sec/batch
Epoch 6/20  Iteration 909/3560 Training loss: 1.6990 5.9922 sec/batch
Epoch 6/20  Iteration 910/3560 Training loss: 1.6994 5.9841 sec/batch
Epoch 6/20  Iteration 911/3560 Training loss: 1.6984 5.9865 sec/batch
Epoch 6/20  Iteration 912/3560 Training loss: 1.6988 5.9861 sec/batch
Epoch 6/20  Iteration 913/3560 Training loss: 1.6978 6.4334 sec/batch
Epoch 6/20  Iteration 914/3560 Training loss: 1.6971 5.9450 sec/batch
Epoch 6/20  Iteration 915/3560 Training loss: 1.6968 5.9847 sec/batch
Epoch 6/20  Iteration 916/3560 Training loss: 1.6953 6.0313 sec/batch
Epoch 6/20  Iteration 917/3560 Training loss: 1.6941 6.0164 sec/batch
Epoch 6/20  Iteration 918/3560 Training loss: 1.6940 6.0065 sec/batch
Epoch 6/20  Iteration 919/3560 Training loss: 1.6943 5.9819 sec/batch
Epoch 6/20  Iteration 920/3560 Training loss: 1.6944 6.2865 sec/batch
Epoch 6/20  Iteration 921/3560 Training loss: 1.6943 5.9854 sec/batch
Epoch 6/20  Iteration 922/3560 Training loss: 1.6933 5.9755 sec/batch
Epoch 6/20  Iteration 923/3560 Training loss: 1.6933 5.9398 sec/batch
Epoch 6/20  Iteration 924/3560 Training loss: 1.6939 5.9591 sec/batch
Epoch 6/20  Iteration 925/3560 Training loss: 1.6935 5.9933 sec/batch
Epoch 6/20  Iteration 926/3560 Training loss: 1.6932 5.9661 sec/batch
Epoch 6/20  Iteration 927/3560 Training loss: 1.6923 5.9946 sec/batch
Epoch 6/20  Iteration 928/3560 Training loss: 1.6911 6.0100 sec/batch
Epoch 6/20  Iteration 929/3560 Training loss: 1.6896 5.9273 sec/batch
Epoch 6/20  Iteration 930/3560 Training loss: 1.6888 6.0082 sec/batch
Epoch 6/20  Iteration 931/3560 Training loss: 1.6881 5.9616 sec/batch
Epoch 6/20  Iteration 932/3560 Training loss: 1.6884 6.0011 sec/batch
Epoch 6/20  Iteration 933/3560 Training loss: 1.6878 5.9314 sec/batch
Epoch 6/20  Iteration 934/3560 Training loss: 1.6867 5.9442 sec/batch
Epoch 6/20  Iteration 935/3560 Training loss: 1.6869 5.9473 sec/batch
Epoch 6/20  Iteration 936/3560 Training loss: 1.6858 5.9406 sec/batch
Epoch 6/20  Iteration 937/3560 Training loss: 1.6853 5.9680 sec/batch
Epoch 6/20  Iteration 938/3560 Training loss: 1.6847 5.9731 sec/batch
Epoch 6/20  Iteration 939/3560 Training loss: 1.6841 5.9942 sec/batch
Epoch 6/20  Iteration 940/3560 Training loss: 1.6843 6.2793 sec/batch
Epoch 6/20  Iteration 941/3560 Training loss: 1.6835 5.9988 sec/batch
Epoch 6/20  Iteration 942/3560 Training loss: 1.6842 5.9578 sec/batch
Epoch 6/20  Iteration 943/3560 Training loss: 1.6839 5.9357 sec/batch
Epoch 6/20  Iteration 944/3560 Training loss: 1.6837 5.9998 sec/batch
Epoch 6/20  Iteration 945/3560 Training loss: 1.6833 5.9730 sec/batch
Epoch 6/20  Iteration 946/3560 Training loss: 1.6832 5.9584 sec/batch
Epoch 6/20  Iteration 947/3560 Training loss: 1.6833 6.0672 sec/batch
Epoch 6/20  Iteration 948/3560 Training loss: 1.6828 5.9909 sec/batch
Epoch 6/20  Iteration 949/3560 Training loss: 1.6821 6.0227 sec/batch
Epoch 6/20  Iteration 950/3560 Training loss: 1.6825 5.9712 sec/batch
Epoch 6/20  Iteration 951/3560 Training loss: 1.6821 5.9912 sec/batch
Epoch 6/20  Iteration 952/3560 Training loss: 1.6827 5.9944 sec/batch
Epoch 6/20  Iteration 953/3560 Training loss: 1.6829 5.9739 sec/batch
Epoch 6/20  Iteration 954/3560 Training loss: 1.6829 5.9752 sec/batch
Epoch 6/20  Iteration 955/3560 Training loss: 1.6827 5.9794 sec/batch
Epoch 6/20  Iteration 956/3560 Training loss: 1.6829 6.0009 sec/batch
Epoch 6/20  Iteration 957/3560 Training loss: 1.6829 6.0002 sec/batch
Epoch 6/20  Iteration 958/3560 Training loss: 1.6824 5.9657 sec/batch
Epoch 6/20  Iteration 959/3560 Training loss: 1.6822 5.9911 sec/batch
Epoch 6/20  Iteration 960/3560 Training loss: 1.6820 6.2047 sec/batch
Epoch 6/20  Iteration 961/3560 Training loss: 1.6824 5.9874 sec/batch
Epoch 6/20  Iteration 962/3560 Training loss: 1.6823 5.9839 sec/batch
Epoch 6/20  Iteration 963/3560 Training loss: 1.6826 6.2140 sec/batch
Epoch 6/20  Iteration 964/3560 Training loss: 1.6820 5.9846 sec/batch
Epoch 6/20  Iteration 965/3560 Training loss: 1.6817 5.9416 sec/batch
Epoch 6/20  Iteration 966/3560 Training loss: 1.6818 5.9956 sec/batch
Epoch 6/20  Iteration 967/3560 Training loss: 1.6815 5.9844 sec/batch
Epoch 6/20  Iteration 968/3560 Training loss: 1.6814 5.9680 sec/batch
Epoch 6/20  Iteration 969/3560 Training loss: 1.6807 5.9642 sec/batch
Epoch 6/20  Iteration 970/3560 Training loss: 1.6804 6.0313 sec/batch
Epoch 6/20  Iteration 971/3560 Training loss: 1.6798 6.0041 sec/batch
Epoch 6/20  Iteration 972/3560 Training loss: 1.6797 5.9880 sec/batch
Epoch 6/20  Iteration 973/3560 Training loss: 1.6790 5.9356 sec/batch
Epoch 6/20  Iteration 974/3560 Training loss: 1.6787 5.8977 sec/batch
Epoch 6/20  Iteration 975/3560 Training loss: 1.6782 5.9901 sec/batch
Epoch 6/20  Iteration 976/3560 Training loss: 1.6777 5.9629 sec/batch
Epoch 6/20  Iteration 977/3560 Training loss: 1.6773 6.0030 sec/batch
Epoch 6/20  Iteration 978/3560 Training loss: 1.6769 6.0053 sec/batch
Epoch 6/20  Iteration 979/3560 Training loss: 1.6762 5.9604 sec/batch
Epoch 6/20  Iteration 980/3560 Training loss: 1.6762 6.1970 sec/batch
Epoch 6/20  Iteration 981/3560 Training loss: 1.6757 6.0477 sec/batch
Epoch 6/20  Iteration 982/3560 Training loss: 1.6752 5.9795 sec/batch
Epoch 6/20  Iteration 983/3560 Training loss: 1.6746 5.9851 sec/batch
Epoch 6/20  Iteration 984/3560 Training loss: 1.6742 5.9429 sec/batch
Epoch 6/20  Iteration 985/3560 Training loss: 1.6737 5.9554 sec/batch
Epoch 6/20  Iteration 986/3560 Training loss: 1.6734 5.9420 sec/batch
Epoch 6/20  Iteration 987/3560 Training loss: 1.6731 6.0088 sec/batch
Epoch 6/20  Iteration 988/3560 Training loss: 1.6725 5.9492 sec/batch
Epoch 6/20  Iteration 989/3560 Training loss: 1.6720 6.0145 sec/batch
Epoch 6/20  Iteration 990/3560 Training loss: 1.6713 5.9705 sec/batch
Epoch 6/20  Iteration 991/3560 Training loss: 1.6712 5.9745 sec/batch
Epoch 6/20  Iteration 992/3560 Training loss: 1.6708 5.9586 sec/batch
Epoch 6/20  Iteration 993/3560 Training loss: 1.6704 6.0163 sec/batch
Epoch 6/20  Iteration 994/3560 Training loss: 1.6701 5.9385 sec/batch
Epoch 6/20  Iteration 995/3560 Training loss: 1.6696 5.9498 sec/batch
Epoch 6/20  Iteration 996/3560 Training loss: 1.6694 5.9738 sec/batch
Epoch 6/20  Iteration 997/3560 Training loss: 1.6691 5.9665 sec/batch
Epoch 6/20  Iteration 998/3560 Training loss: 1.6688 6.0082 sec/batch
Epoch 6/20  Iteration 999/3560 Training loss: 1.6686 5.9530 sec/batch
Epoch 6/20  Iteration 1000/3560 Training loss: 1.6684 6.2670 sec/batch
Epoch 6/20  Iteration 1001/3560 Training loss: 1.6680 5.9626 sec/batch
Epoch 6/20  Iteration 1002/3560 Training loss: 1.6678 5.9945 sec/batch
Epoch 6/20  Iteration 1003/3560 Training loss: 1.6674 5.9613 sec/batch
Epoch 6/20  Iteration 1004/3560 Training loss: 1.6671 5.9870 sec/batch
Epoch 6/20  Iteration 1005/3560 Training loss: 1.6667 6.0007 sec/batch
Epoch 6/20  Iteration 1006/3560 Training loss: 1.6661 6.0252 sec/batch
Epoch 6/20  Iteration 1007/3560 Training loss: 1.6659 5.9759 sec/batch
Epoch 6/20  Iteration 1008/3560 Training loss: 1.6657 5.9907 sec/batch
Epoch 6/20  Iteration 1009/3560 Training loss: 1.6653 5.9749 sec/batch
Epoch 6/20  Iteration 1010/3560 Training loss: 1.6650 5.9884 sec/batch
Epoch 6/20  Iteration 1011/3560 Training loss: 1.6647 5.9917 sec/batch
Epoch 6/20  Iteration 1012/3560 Training loss: 1.6642 5.9454 sec/batch
Epoch 6/20  Iteration 1013/3560 Training loss: 1.6637 6.2680 sec/batch
Epoch 6/20  Iteration 1014/3560 Training loss: 1.6635 5.9582 sec/batch
Epoch 6/20  Iteration 1015/3560 Training loss: 1.6634 6.0163 sec/batch
Epoch 6/20  Iteration 1016/3560 Training loss: 1.6627 6.0135 sec/batch
Epoch 6/20  Iteration 1017/3560 Training loss: 1.6627 6.0123 sec/batch
Epoch 6/20  Iteration 1018/3560 Training loss: 1.6625 5.9752 sec/batch
Epoch 6/20  Iteration 1019/3560 Training loss: 1.6622 6.0160 sec/batch
Epoch 6/20  Iteration 1020/3560 Training loss: 1.6618 6.2472 sec/batch
Epoch 6/20  Iteration 1021/3560 Training loss: 1.6613 6.0105 sec/batch
Epoch 6/20  Iteration 1022/3560 Training loss: 1.6608 6.0146 sec/batch
Epoch 6/20  Iteration 1023/3560 Training loss: 1.6607 5.9850 sec/batch
Epoch 6/20  Iteration 1024/3560 Training loss: 1.6606 5.9723 sec/batch
Epoch 6/20  Iteration 1025/3560 Training loss: 1.6603 5.9386 sec/batch
Epoch 6/20  Iteration 1026/3560 Training loss: 1.6602 5.9307 sec/batch
Epoch 6/20  Iteration 1027/3560 Training loss: 1.6601 5.9697 sec/batch
Epoch 6/20  Iteration 1028/3560 Training loss: 1.6600 6.0050 sec/batch
Epoch 6/20  Iteration 1029/3560 Training loss: 1.6599 6.4687 sec/batch
Epoch 6/20  Iteration 1030/3560 Training loss: 1.6597 6.1940 sec/batch
Epoch 6/20  Iteration 1031/3560 Training loss: 1.6597 6.4478 sec/batch
Epoch 6/20  Iteration 1032/3560 Training loss: 1.6594 6.0661 sec/batch
Epoch 6/20  Iteration 1033/3560 Training loss: 1.6592 5.9424 sec/batch
Epoch 6/20  Iteration 1034/3560 Training loss: 1.6591 6.0286 sec/batch
Epoch 6/20  Iteration 1035/3560 Training loss: 1.6589 5.9920 sec/batch
Epoch 6/20  Iteration 1036/3560 Training loss: 1.6588 5.9351 sec/batch
Epoch 6/20  Iteration 1037/3560 Training loss: 1.6587 6.0074 sec/batch
Epoch 6/20  Iteration 1038/3560 Training loss: 1.6587 5.9869 sec/batch
Epoch 6/20  Iteration 1039/3560 Training loss: 1.6585 5.9457 sec/batch
Epoch 6/20  Iteration 1040/3560 Training loss: 1.6582 6.2565 sec/batch
Epoch 6/20  Iteration 1041/3560 Training loss: 1.6578 5.9794 sec/batch
Epoch 6/20  Iteration 1042/3560 Training loss: 1.6576 6.0625 sec/batch
Epoch 6/20  Iteration 1043/3560 Training loss: 1.6575 5.9169 sec/batch
Epoch 6/20  Iteration 1044/3560 Training loss: 1.6573 5.9605 sec/batch
Epoch 6/20  Iteration 1045/3560 Training loss: 1.6571 5.9985 sec/batch
Epoch 6/20  Iteration 1046/3560 Training loss: 1.6569 5.9758 sec/batch
Epoch 6/20  Iteration 1047/3560 Training loss: 1.6568 5.9706 sec/batch
Epoch 6/20  Iteration 1048/3560 Training loss: 1.6566 5.9789 sec/batch
Epoch 6/20  Iteration 1049/3560 Training loss: 1.6562 6.0001 sec/batch
Epoch 6/20  Iteration 1050/3560 Training loss: 1.6561 5.9350 sec/batch
Epoch 6/20  Iteration 1051/3560 Training loss: 1.6561 5.9675 sec/batch
Epoch 6/20  Iteration 1052/3560 Training loss: 1.6559 6.0980 sec/batch
Epoch 6/20  Iteration 1053/3560 Training loss: 1.6557 5.9987 sec/batch
Epoch 6/20  Iteration 1054/3560 Training loss: 1.6555 5.9248 sec/batch
Epoch 6/20  Iteration 1055/3560 Training loss: 1.6553 5.9477 sec/batch
Epoch 6/20  Iteration 1056/3560 Training loss: 1.6551 5.9438 sec/batch
Epoch 6/20  Iteration 1057/3560 Training loss: 1.6550 5.9801 sec/batch
Epoch 6/20  Iteration 1058/3560 Training loss: 1.6553 5.9855 sec/batch
Epoch 6/20  Iteration 1059/3560 Training loss: 1.6551 6.1345 sec/batch
Epoch 6/20  Iteration 1060/3560 Training loss: 1.6549 6.1221 sec/batch
Epoch 6/20  Iteration 1061/3560 Training loss: 1.6546 5.9854 sec/batch
Epoch 6/20  Iteration 1062/3560 Training loss: 1.6542 6.0087 sec/batch
Epoch 6/20  Iteration 1063/3560 Training loss: 1.6542 6.4497 sec/batch
Epoch 6/20  Iteration 1064/3560 Training loss: 1.6541 6.0032 sec/batch
Epoch 6/20  Iteration 1065/3560 Training loss: 1.6540 5.9393 sec/batch
Epoch 6/20  Iteration 1066/3560 Training loss: 1.6537 6.0058 sec/batch
Epoch 6/20  Iteration 1067/3560 Training loss: 1.6533 5.9960 sec/batch
Epoch 6/20  Iteration 1068/3560 Training loss: 1.6532 6.0089 sec/batch
Epoch 7/20  Iteration 1069/3560 Training loss: 1.7123 5.9736 sec/batch
Epoch 7/20  Iteration 1070/3560 Training loss: 1.6607 5.9539 sec/batch
Epoch 7/20  Iteration 1071/3560 Training loss: 1.6455 5.9564 sec/batch
Epoch 7/20  Iteration 1072/3560 Training loss: 1.6373 5.9895 sec/batch
Epoch 7/20  Iteration 1073/3560 Training loss: 1.6296 6.0047 sec/batch
Epoch 7/20  Iteration 1074/3560 Training loss: 1.6189 5.8859 sec/batch
Epoch 7/20  Iteration 1075/3560 Training loss: 1.6192 6.0563 sec/batch
Epoch 7/20  Iteration 1076/3560 Training loss: 1.6164 5.9364 sec/batch
Epoch 7/20  Iteration 1077/3560 Training loss: 1.6177 6.0061 sec/batch
Epoch 7/20  Iteration 1078/3560 Training loss: 1.6172 5.9475 sec/batch
Epoch 7/20  Iteration 1079/3560 Training loss: 1.6142 6.0244 sec/batch
Epoch 7/20  Iteration 1080/3560 Training loss: 1.6126 6.1934 sec/batch
Epoch 7/20  Iteration 1081/3560 Training loss: 1.6116 5.9870 sec/batch
Epoch 7/20  Iteration 1082/3560 Training loss: 1.6134 5.9683 sec/batch
Epoch 7/20  Iteration 1083/3560 Training loss: 1.6122 5.9878 sec/batch
Epoch 7/20  Iteration 1084/3560 Training loss: 1.6110 5.9652 sec/batch
Epoch 7/20  Iteration 1085/3560 Training loss: 1.6112 5.9662 sec/batch
Epoch 7/20  Iteration 1086/3560 Training loss: 1.6133 5.9433 sec/batch
Epoch 7/20  Iteration 1087/3560 Training loss: 1.6130 6.0392 sec/batch
Epoch 7/20  Iteration 1088/3560 Training loss: 1.6133 5.9653 sec/batch
Epoch 7/20  Iteration 1089/3560 Training loss: 1.6127 5.9656 sec/batch
Epoch 7/20  Iteration 1090/3560 Training loss: 1.6134 5.9911 sec/batch
Epoch 7/20  Iteration 1091/3560 Training loss: 1.6125 6.0088 sec/batch
Epoch 7/20  Iteration 1092/3560 Training loss: 1.6121 5.9685 sec/batch
Epoch 7/20  Iteration 1093/3560 Training loss: 1.6120 5.9385 sec/batch
Epoch 7/20  Iteration 1094/3560 Training loss: 1.6103 5.9157 sec/batch
Epoch 7/20  Iteration 1095/3560 Training loss: 1.6086 6.0078 sec/batch
Epoch 7/20  Iteration 1096/3560 Training loss: 1.6090 5.9570 sec/batch
Epoch 7/20  Iteration 1097/3560 Training loss: 1.6095 5.9795 sec/batch
Epoch 7/20  Iteration 1098/3560 Training loss: 1.6095 5.9998 sec/batch
Epoch 7/20  Iteration 1099/3560 Training loss: 1.6090 6.0204 sec/batch
Epoch 7/20  Iteration 1100/3560 Training loss: 1.6078 6.2800 sec/batch
Epoch 7/20  Iteration 1101/3560 Training loss: 1.6080 6.0149 sec/batch
Epoch 7/20  Iteration 1102/3560 Training loss: 1.6082 6.0432 sec/batch
Epoch 7/20  Iteration 1103/3560 Training loss: 1.6078 5.9272 sec/batch
Epoch 7/20  Iteration 1104/3560 Training loss: 1.6074 5.9649 sec/batch
Epoch 7/20  Iteration 1105/3560 Training loss: 1.6063 5.9147 sec/batch
Epoch 7/20  Iteration 1106/3560 Training loss: 1.6052 5.9719 sec/batch
Epoch 7/20  Iteration 1107/3560 Training loss: 1.6035 5.9637 sec/batch
Epoch 7/20  Iteration 1108/3560 Training loss: 1.6027 5.9265 sec/batch
Epoch 7/20  Iteration 1109/3560 Training loss: 1.6021 6.0333 sec/batch
Epoch 7/20  Iteration 1110/3560 Training loss: 1.6026 5.9219 sec/batch
Epoch 7/20  Iteration 1111/3560 Training loss: 1.6019 5.9850 sec/batch
Epoch 7/20  Iteration 1112/3560 Training loss: 1.6012 5.9816 sec/batch
Epoch 7/20  Iteration 1113/3560 Training loss: 1.6014 6.7549 sec/batch
Epoch 7/20  Iteration 1114/3560 Training loss: 1.6007 5.9970 sec/batch
Epoch 7/20  Iteration 1115/3560 Training loss: 1.6002 5.9594 sec/batch
Epoch 7/20  Iteration 1116/3560 Training loss: 1.5996 6.0754 sec/batch
Epoch 7/20  Iteration 1117/3560 Training loss: 1.5992 5.9653 sec/batch
Epoch 7/20  Iteration 1118/3560 Training loss: 1.5995 6.0068 sec/batch
Epoch 7/20  Iteration 1119/3560 Training loss: 1.5990 6.1853 sec/batch
Epoch 7/20  Iteration 1120/3560 Training loss: 1.5999 6.0464 sec/batch
Epoch 7/20  Iteration 1121/3560 Training loss: 1.5997 5.9477 sec/batch
Epoch 7/20  Iteration 1122/3560 Training loss: 1.5995 5.9799 sec/batch
Epoch 7/20  Iteration 1123/3560 Training loss: 1.5991 5.9734 sec/batch
Epoch 7/20  Iteration 1124/3560 Training loss: 1.5991 5.9533 sec/batch
Epoch 7/20  Iteration 1125/3560 Training loss: 1.5994 5.9766 sec/batch
Epoch 7/20  Iteration 1126/3560 Training loss: 1.5990 5.9582 sec/batch
Epoch 7/20  Iteration 1127/3560 Training loss: 1.5982 5.9629 sec/batch
Epoch 7/20  Iteration 1128/3560 Training loss: 1.5987 6.0223 sec/batch
Epoch 7/20  Iteration 1129/3560 Training loss: 1.5984 6.0494 sec/batch
Epoch 7/20  Iteration 1130/3560 Training loss: 1.5991 5.9980 sec/batch
Epoch 7/20  Iteration 1131/3560 Training loss: 1.5995 5.9296 sec/batch
Epoch 7/20  Iteration 1132/3560 Training loss: 1.5997 5.9720 sec/batch
Epoch 7/20  Iteration 1133/3560 Training loss: 1.5994 5.9541 sec/batch
Epoch 7/20  Iteration 1134/3560 Training loss: 1.5994 5.9778 sec/batch
Epoch 7/20  Iteration 1135/3560 Training loss: 1.5994 5.9903 sec/batch
Epoch 7/20  Iteration 1136/3560 Training loss: 1.5991 5.9494 sec/batch
Epoch 7/20  Iteration 1137/3560 Training loss: 1.5989 6.0282 sec/batch
Epoch 7/20  Iteration 1138/3560 Training loss: 1.5988 6.0248 sec/batch
Epoch 7/20  Iteration 1139/3560 Training loss: 1.5992 6.1962 sec/batch
Epoch 7/20  Iteration 1140/3560 Training loss: 1.5994 6.0278 sec/batch
Epoch 7/20  Iteration 1141/3560 Training loss: 1.5996 5.9522 sec/batch
Epoch 7/20  Iteration 1142/3560 Training loss: 1.5992 6.0179 sec/batch
Epoch 7/20  Iteration 1143/3560 Training loss: 1.5989 5.9834 sec/batch
Epoch 7/20  Iteration 1144/3560 Training loss: 1.5992 5.9434 sec/batch
Epoch 7/20  Iteration 1145/3560 Training loss: 1.5989 5.9732 sec/batch
Epoch 7/20  Iteration 1146/3560 Training loss: 1.5987 6.0048 sec/batch
Epoch 7/20  Iteration 1147/3560 Training loss: 1.5980 6.0068 sec/batch
Epoch 7/20  Iteration 1148/3560 Training loss: 1.5977 5.9580 sec/batch
Epoch 7/20  Iteration 1149/3560 Training loss: 1.5971 5.9894 sec/batch
Epoch 7/20  Iteration 1150/3560 Training loss: 1.5969 5.9344 sec/batch
Epoch 7/20  Iteration 1151/3560 Training loss: 1.5962 5.9817 sec/batch
Epoch 7/20  Iteration 1152/3560 Training loss: 1.5960 5.9954 sec/batch
Epoch 7/20  Iteration 1153/3560 Training loss: 1.5955 6.0059 sec/batch
Epoch 7/20  Iteration 1154/3560 Training loss: 1.5951 5.9643 sec/batch
Epoch 7/20  Iteration 1155/3560 Training loss: 1.5947 6.0033 sec/batch
Epoch 7/20  Iteration 1156/3560 Training loss: 1.5943 5.9837 sec/batch
Epoch 7/20  Iteration 1157/3560 Training loss: 1.5936 5.9992 sec/batch
Epoch 7/20  Iteration 1158/3560 Training loss: 1.5936 5.9962 sec/batch
Epoch 7/20  Iteration 1159/3560 Training loss: 1.5930 6.1951 sec/batch
Epoch 7/20  Iteration 1160/3560 Training loss: 1.5926 6.0069 sec/batch
Epoch 7/20  Iteration 1161/3560 Training loss: 1.5921 6.0280 sec/batch
Epoch 7/20  Iteration 1162/3560 Training loss: 1.5916 6.0303 sec/batch
Epoch 7/20  Iteration 1163/3560 Training loss: 1.5911 6.1945 sec/batch
Epoch 7/20  Iteration 1164/3560 Training loss: 1.5910 5.9829 sec/batch
Epoch 7/20  Iteration 1165/3560 Training loss: 1.5909 5.9912 sec/batch
Epoch 7/20  Iteration 1166/3560 Training loss: 1.5903 5.9547 sec/batch
Epoch 7/20  Iteration 1167/3560 Training loss: 1.5899 6.0064 sec/batch
Epoch 7/20  Iteration 1168/3560 Training loss: 1.5893 5.9942 sec/batch
Epoch 7/20  Iteration 1169/3560 Training loss: 1.5891 6.0081 sec/batch
Epoch 7/20  Iteration 1170/3560 Training loss: 1.5888 5.9755 sec/batch
Epoch 7/20  Iteration 1171/3560 Training loss: 1.5885 5.9857 sec/batch
Epoch 7/20  Iteration 1172/3560 Training loss: 1.5883 6.0300 sec/batch
Epoch 7/20  Iteration 1173/3560 Training loss: 1.5879 6.0731 sec/batch
Epoch 7/20  Iteration 1174/3560 Training loss: 1.5877 6.0598 sec/batch
Epoch 7/20  Iteration 1175/3560 Training loss: 1.5873 6.0473 sec/batch
Epoch 7/20  Iteration 1176/3560 Training loss: 1.5871 6.0564 sec/batch
Epoch 7/20  Iteration 1177/3560 Training loss: 1.5869 6.0833 sec/batch
Epoch 7/20  Iteration 1178/3560 Training loss: 1.5868 6.2251 sec/batch
Epoch 7/20  Iteration 1179/3560 Training loss: 1.5865 6.3716 sec/batch
Epoch 7/20  Iteration 1180/3560 Training loss: 1.5863 6.2727 sec/batch
Epoch 7/20  Iteration 1181/3560 Training loss: 1.5860 6.1965 sec/batch
Epoch 7/20  Iteration 1182/3560 Training loss: 1.5858 6.0324 sec/batch
Epoch 7/20  Iteration 1183/3560 Training loss: 1.5854 6.0381 sec/batch
Epoch 7/20  Iteration 1184/3560 Training loss: 1.5849 6.0909 sec/batch
Epoch 7/20  Iteration 1185/3560 Training loss: 1.5848 6.1378 sec/batch
Epoch 7/20  Iteration 1186/3560 Training loss: 1.5846 5.9766 sec/batch
Epoch 7/20  Iteration 1187/3560 Training loss: 1.5843 5.9627 sec/batch
Epoch 7/20  Iteration 1188/3560 Training loss: 1.5841 5.9699 sec/batch
Epoch 7/20  Iteration 1189/3560 Training loss: 1.5839 5.9628 sec/batch
Epoch 7/20  Iteration 1190/3560 Training loss: 1.5834 5.9903 sec/batch
Epoch 7/20  Iteration 1191/3560 Training loss: 1.5829 6.0223 sec/batch
Epoch 7/20  Iteration 1192/3560 Training loss: 1.5829 5.9196 sec/batch
Epoch 7/20  Iteration 1193/3560 Training loss: 1.5827 5.9760 sec/batch
Epoch 7/20  Iteration 1194/3560 Training loss: 1.5822 5.9605 sec/batch
Epoch 7/20  Iteration 1195/3560 Training loss: 1.5822 5.9570 sec/batch
Epoch 7/20  Iteration 1196/3560 Training loss: 1.5822 6.0073 sec/batch
Epoch 7/20  Iteration 1197/3560 Training loss: 1.5820 5.9635 sec/batch
Epoch 7/20  Iteration 1198/3560 Training loss: 1.5817 5.9956 sec/batch
Epoch 7/20  Iteration 1199/3560 Training loss: 1.5812 6.2669 sec/batch
Epoch 7/20  Iteration 1200/3560 Training loss: 1.5808 6.2088 sec/batch
Epoch 7/20  Iteration 1201/3560 Training loss: 1.5807 6.0013 sec/batch
Epoch 7/20  Iteration 1202/3560 Training loss: 1.5806 5.9972 sec/batch
Epoch 7/20  Iteration 1203/3560 Training loss: 1.5805 6.0267 sec/batch
Epoch 7/20  Iteration 1204/3560 Training loss: 1.5804 5.9615 sec/batch
Epoch 7/20  Iteration 1205/3560 Training loss: 1.5805 5.9938 sec/batch
Epoch 7/20  Iteration 1206/3560 Training loss: 1.5804 6.0022 sec/batch
Epoch 7/20  Iteration 1207/3560 Training loss: 1.5803 5.9866 sec/batch
Epoch 7/20  Iteration 1208/3560 Training loss: 1.5801 5.9962 sec/batch
Epoch 7/20  Iteration 1209/3560 Training loss: 1.5803 5.9860 sec/batch
Epoch 7/20  Iteration 1210/3560 Training loss: 1.5802 5.9432 sec/batch
Epoch 7/20  Iteration 1211/3560 Training loss: 1.5800 5.9903 sec/batch
Epoch 7/20  Iteration 1212/3560 Training loss: 1.5800 6.3555 sec/batch
Epoch 7/20  Iteration 1213/3560 Training loss: 1.5799 5.9684 sec/batch
Epoch 7/20  Iteration 1214/3560 Training loss: 1.5798 5.9733 sec/batch
Epoch 7/20  Iteration 1215/3560 Training loss: 1.5797 6.0023 sec/batch
Epoch 7/20  Iteration 1216/3560 Training loss: 1.5797 5.9846 sec/batch
Epoch 7/20  Iteration 1217/3560 Training loss: 1.5797 5.9913 sec/batch
Epoch 7/20  Iteration 1218/3560 Training loss: 1.5795 5.9912 sec/batch
Epoch 7/20  Iteration 1219/3560 Training loss: 1.5791 6.1763 sec/batch
Epoch 7/20  Iteration 1220/3560 Training loss: 1.5790 6.0532 sec/batch
Epoch 7/20  Iteration 1221/3560 Training loss: 1.5789 5.9713 sec/batch
Epoch 7/20  Iteration 1222/3560 Training loss: 1.5787 5.9597 sec/batch
Epoch 7/20  Iteration 1223/3560 Training loss: 1.5786 5.9912 sec/batch
Epoch 7/20  Iteration 1224/3560 Training loss: 1.5784 5.9713 sec/batch
Epoch 7/20  Iteration 1225/3560 Training loss: 1.5783 5.9684 sec/batch
Epoch 7/20  Iteration 1226/3560 Training loss: 1.5781 6.0034 sec/batch
Epoch 7/20  Iteration 1227/3560 Training loss: 1.5777 5.9197 sec/batch
Epoch 7/20  Iteration 1228/3560 Training loss: 1.5776 5.9738 sec/batch
Epoch 7/20  Iteration 1229/3560 Training loss: 1.5777 5.9819 sec/batch
Epoch 7/20  Iteration 1230/3560 Training loss: 1.5776 5.9771 sec/batch
Epoch 7/20  Iteration 1231/3560 Training loss: 1.5774 6.0127 sec/batch
Epoch 7/20  Iteration 1232/3560 Training loss: 1.5773 5.9794 sec/batch
Epoch 7/20  Iteration 1233/3560 Training loss: 1.5772 5.9718 sec/batch
Epoch 7/20  Iteration 1234/3560 Training loss: 1.5769 6.0155 sec/batch
Epoch 7/20  Iteration 1235/3560 Training loss: 1.5770 5.9547 sec/batch
Epoch 7/20  Iteration 1236/3560 Training loss: 1.5773 5.9746 sec/batch
Epoch 7/20  Iteration 1237/3560 Training loss: 1.5771 5.9786 sec/batch
Epoch 7/20  Iteration 1238/3560 Training loss: 1.5769 6.0087 sec/batch
Epoch 7/20  Iteration 1239/3560 Training loss: 1.5767 6.1827 sec/batch
Epoch 7/20  Iteration 1240/3560 Training loss: 1.5764 6.0570 sec/batch
Epoch 7/20  Iteration 1241/3560 Training loss: 1.5764 6.0149 sec/batch
Epoch 7/20  Iteration 1242/3560 Training loss: 1.5762 6.0085 sec/batch
Epoch 7/20  Iteration 1243/3560 Training loss: 1.5762 6.0223 sec/batch
Epoch 7/20  Iteration 1244/3560 Training loss: 1.5759 5.9374 sec/batch
Epoch 7/20  Iteration 1245/3560 Training loss: 1.5756 5.9133 sec/batch
Epoch 7/20  Iteration 1246/3560 Training loss: 1.5756 5.9737 sec/batch
Epoch 8/20  Iteration 1247/3560 Training loss: 1.6400 5.9904 sec/batch
Epoch 8/20  Iteration 1248/3560 Training loss: 1.5955 5.9823 sec/batch
Epoch 8/20  Iteration 1249/3560 Training loss: 1.5790 6.0052 sec/batch
Epoch 8/20  Iteration 1250/3560 Training loss: 1.5712 5.9901 sec/batch
Epoch 8/20  Iteration 1251/3560 Training loss: 1.5639 5.9839 sec/batch
Epoch 8/20  Iteration 1252/3560 Training loss: 1.5519 5.9797 sec/batch
Epoch 8/20  Iteration 1253/3560 Training loss: 1.5515 5.9874 sec/batch
Epoch 8/20  Iteration 1254/3560 Training loss: 1.5496 5.9975 sec/batch
Epoch 8/20  Iteration 1255/3560 Training loss: 1.5502 5.9833 sec/batch
Epoch 8/20  Iteration 1256/3560 Training loss: 1.5495 5.9828 sec/batch
Epoch 8/20  Iteration 1257/3560 Training loss: 1.5466 5.9821 sec/batch
Epoch 8/20  Iteration 1258/3560 Training loss: 1.5451 6.0033 sec/batch
Epoch 8/20  Iteration 1259/3560 Training loss: 1.5453 6.2669 sec/batch
Epoch 8/20  Iteration 1260/3560 Training loss: 1.5478 6.0177 sec/batch
Epoch 8/20  Iteration 1261/3560 Training loss: 1.5463 5.9993 sec/batch
Epoch 8/20  Iteration 1262/3560 Training loss: 1.5453 6.2331 sec/batch
Epoch 8/20  Iteration 1263/3560 Training loss: 1.5459 5.9562 sec/batch
Epoch 8/20  Iteration 1264/3560 Training loss: 1.5474 5.9906 sec/batch
Epoch 8/20  Iteration 1265/3560 Training loss: 1.5470 6.0633 sec/batch
Epoch 8/20  Iteration 1266/3560 Training loss: 1.5485 5.9968 sec/batch
Epoch 8/20  Iteration 1267/3560 Training loss: 1.5476 5.9948 sec/batch
Epoch 8/20  Iteration 1268/3560 Training loss: 1.5482 5.9636 sec/batch
Epoch 8/20  Iteration 1269/3560 Training loss: 1.5473 6.0039 sec/batch
Epoch 8/20  Iteration 1270/3560 Training loss: 1.5467 5.9510 sec/batch
Epoch 8/20  Iteration 1271/3560 Training loss: 1.5465 5.9841 sec/batch
Epoch 8/20  Iteration 1272/3560 Training loss: 1.5451 5.9677 sec/batch
Epoch 8/20  Iteration 1273/3560 Training loss: 1.5442 5.9707 sec/batch
Epoch 8/20  Iteration 1274/3560 Training loss: 1.5441 5.9461 sec/batch
Epoch 8/20  Iteration 1275/3560 Training loss: 1.5444 5.9562 sec/batch
Epoch 8/20  Iteration 1276/3560 Training loss: 1.5445 5.9610 sec/batch
Epoch 8/20  Iteration 1277/3560 Training loss: 1.5441 5.9873 sec/batch
Epoch 8/20  Iteration 1278/3560 Training loss: 1.5432 5.9760 sec/batch
Epoch 8/20  Iteration 1279/3560 Training loss: 1.5433 6.2253 sec/batch
Epoch 8/20  Iteration 1280/3560 Training loss: 1.5435 5.9990 sec/batch
Epoch 8/20  Iteration 1281/3560 Training loss: 1.5433 5.9811 sec/batch
Epoch 8/20  Iteration 1282/3560 Training loss: 1.5428 5.9679 sec/batch
Epoch 8/20  Iteration 1283/3560 Training loss: 1.5418 5.9707 sec/batch
Epoch 8/20  Iteration 1284/3560 Training loss: 1.5410 6.0138 sec/batch
Epoch 8/20  Iteration 1285/3560 Training loss: 1.5395 5.9846 sec/batch
Epoch 8/20  Iteration 1286/3560 Training loss: 1.5386 6.0192 sec/batch
Epoch 8/20  Iteration 1287/3560 Training loss: 1.5379 5.9580 sec/batch
Epoch 8/20  Iteration 1288/3560 Training loss: 1.5385 5.9936 sec/batch
Epoch 8/20  Iteration 1289/3560 Training loss: 1.5379 6.0069 sec/batch
Epoch 8/20  Iteration 1290/3560 Training loss: 1.5369 5.9665 sec/batch
Epoch 8/20  Iteration 1291/3560 Training loss: 1.5370 5.9580 sec/batch
Epoch 8/20  Iteration 1292/3560 Training loss: 1.5363 6.0438 sec/batch
Epoch 8/20  Iteration 1293/3560 Training loss: 1.5358 5.9543 sec/batch
Epoch 8/20  Iteration 1294/3560 Training loss: 1.5352 5.9564 sec/batch
Epoch 8/20  Iteration 1295/3560 Training loss: 1.5351 5.9926 sec/batch
Epoch 8/20  Iteration 1296/3560 Training loss: 1.5353 5.9483 sec/batch
Epoch 8/20  Iteration 1297/3560 Training loss: 1.5346 5.9985 sec/batch
Epoch 8/20  Iteration 1298/3560 Training loss: 1.5353 5.9160 sec/batch
Epoch 8/20  Iteration 1299/3560 Training loss: 1.5351 6.2619 sec/batch
Epoch 8/20  Iteration 1300/3560 Training loss: 1.5350 6.0152 sec/batch
Epoch 8/20  Iteration 1301/3560 Training loss: 1.5346 5.9988 sec/batch
Epoch 8/20  Iteration 1302/3560 Training loss: 1.5345 5.9872 sec/batch
Epoch 8/20  Iteration 1303/3560 Training loss: 1.5347 5.9988 sec/batch
Epoch 8/20  Iteration 1304/3560 Training loss: 1.5342 6.0067 sec/batch
Epoch 8/20  Iteration 1305/3560 Training loss: 1.5337 6.0027 sec/batch
Epoch 8/20  Iteration 1306/3560 Training loss: 1.5342 5.9889 sec/batch
Epoch 8/20  Iteration 1307/3560 Training loss: 1.5339 5.9978 sec/batch
Epoch 8/20  Iteration 1308/3560 Training loss: 1.5346 5.9860 sec/batch
Epoch 8/20  Iteration 1309/3560 Training loss: 1.5348 5.9624 sec/batch
Epoch 8/20  Iteration 1310/3560 Training loss: 1.5350 5.9810 sec/batch
Epoch 8/20  Iteration 1311/3560 Training loss: 1.5348 5.9709 sec/batch
Epoch 8/20  Iteration 1312/3560 Training loss: 1.5348 6.5053 sec/batch
Epoch 8/20  Iteration 1313/3560 Training loss: 1.5350 5.9785 sec/batch
Epoch 8/20  Iteration 1314/3560 Training loss: 1.5346 5.9805 sec/batch
Epoch 8/20  Iteration 1315/3560 Training loss: 1.5345 6.0618 sec/batch
Epoch 8/20  Iteration 1316/3560 Training loss: 1.5344 5.9517 sec/batch
Epoch 8/20  Iteration 1317/3560 Training loss: 1.5348 5.9838 sec/batch
Epoch 8/20  Iteration 1318/3560 Training loss: 1.5349 6.0152 sec/batch
Epoch 8/20  Iteration 1319/3560 Training loss: 1.5351 6.2823 sec/batch
Epoch 8/20  Iteration 1320/3560 Training loss: 1.5346 5.9329 sec/batch
Epoch 8/20  Iteration 1321/3560 Training loss: 1.5343 6.0120 sec/batch
Epoch 8/20  Iteration 1322/3560 Training loss: 1.5344 5.9798 sec/batch
Epoch 8/20  Iteration 1323/3560 Training loss: 1.5342 6.0698 sec/batch
Epoch 8/20  Iteration 1324/3560 Training loss: 1.5340 5.9727 sec/batch
Epoch 8/20  Iteration 1325/3560 Training loss: 1.5335 5.9529 sec/batch
Epoch 8/20  Iteration 1326/3560 Training loss: 1.5333 5.9735 sec/batch
Epoch 8/20  Iteration 1327/3560 Training loss: 1.5327 5.9495 sec/batch
Epoch 8/20  Iteration 1328/3560 Training loss: 1.5327 5.9280 sec/batch
Epoch 8/20  Iteration 1329/3560 Training loss: 1.5322 5.9901 sec/batch
Epoch 8/20  Iteration 1330/3560 Training loss: 1.5321 5.9707 sec/batch
Epoch 8/20  Iteration 1331/3560 Training loss: 1.5317 5.9831 sec/batch
Epoch 8/20  Iteration 1332/3560 Training loss: 1.5315 6.0884 sec/batch
Epoch 8/20  Iteration 1333/3560 Training loss: 1.5312 5.9888 sec/batch
Epoch 8/20  Iteration 1334/3560 Training loss: 1.5307 5.9512 sec/batch
Epoch 8/20  Iteration 1335/3560 Training loss: 1.5302 5.9606 sec/batch
Epoch 8/20  Iteration 1336/3560 Training loss: 1.5302 5.9819 sec/batch
Epoch 8/20  Iteration 1337/3560 Training loss: 1.5297 6.0000 sec/batch
Epoch 8/20  Iteration 1338/3560 Training loss: 1.5295 5.9442 sec/batch
Epoch 8/20  Iteration 1339/3560 Training loss: 1.5289 6.2783 sec/batch
Epoch 8/20  Iteration 1340/3560 Training loss: 1.5284 6.0022 sec/batch
Epoch 8/20  Iteration 1341/3560 Training loss: 1.5280 5.9876 sec/batch
Epoch 8/20  Iteration 1342/3560 Training loss: 1.5279 6.0061 sec/batch
Epoch 8/20  Iteration 1343/3560 Training loss: 1.5277 5.9687 sec/batch
Epoch 8/20  Iteration 1344/3560 Training loss: 1.5272 5.9806 sec/batch
Epoch 8/20  Iteration 1345/3560 Training loss: 1.5268 5.9685 sec/batch
Epoch 8/20  Iteration 1346/3560 Training loss: 1.5262 6.0073 sec/batch
Epoch 8/20  Iteration 1347/3560 Training loss: 1.5262 5.9739 sec/batch
Epoch 8/20  Iteration 1348/3560 Training loss: 1.5259 5.9668 sec/batch
Epoch 8/20  Iteration 1349/3560 Training loss: 1.5258 5.9513 sec/batch
Epoch 8/20  Iteration 1350/3560 Training loss: 1.5255 5.9724 sec/batch
Epoch 8/20  Iteration 1351/3560 Training loss: 1.5252 5.9303 sec/batch
Epoch 8/20  Iteration 1352/3560 Training loss: 1.5250 5.9741 sec/batch
Epoch 8/20  Iteration 1353/3560 Training loss: 1.5248 5.9707 sec/batch
Epoch 8/20  Iteration 1354/3560 Training loss: 1.5247 5.9574 sec/batch
Epoch 8/20  Iteration 1355/3560 Training loss: 1.5245 5.9883 sec/batch
Epoch 8/20  Iteration 1356/3560 Training loss: 1.5244 6.0286 sec/batch
Epoch 8/20  Iteration 1357/3560 Training loss: 1.5241 6.0373 sec/batch
Epoch 8/20  Iteration 1358/3560 Training loss: 1.5239 5.9733 sec/batch
Epoch 8/20  Iteration 1359/3560 Training loss: 1.5237 6.3049 sec/batch
Epoch 8/20  Iteration 1360/3560 Training loss: 1.5234 6.0070 sec/batch
Epoch 8/20  Iteration 1361/3560 Training loss: 1.5230 5.9996 sec/batch
Epoch 8/20  Iteration 1362/3560 Training loss: 1.5226 6.3008 sec/batch
Epoch 8/20  Iteration 1363/3560 Training loss: 1.5224 5.9949 sec/batch
Epoch 8/20  Iteration 1364/3560 Training loss: 1.5222 5.9894 sec/batch
Epoch 8/20  Iteration 1365/3560 Training loss: 1.5220 5.9775 sec/batch
Epoch 8/20  Iteration 1366/3560 Training loss: 1.5218 5.9714 sec/batch
Epoch 8/20  Iteration 1367/3560 Training loss: 1.5215 5.9461 sec/batch
Epoch 8/20  Iteration 1368/3560 Training loss: 1.5211 5.9843 sec/batch
Epoch 8/20  Iteration 1369/3560 Training loss: 1.5207 5.9249 sec/batch
Epoch 8/20  Iteration 1370/3560 Training loss: 1.5207 5.9776 sec/batch
Epoch 8/20  Iteration 1371/3560 Training loss: 1.5206 5.9236 sec/batch
Epoch 8/20  Iteration 1372/3560 Training loss: 1.5202 6.0126 sec/batch
Epoch 8/20  Iteration 1373/3560 Training loss: 1.5202 6.0526 sec/batch
Epoch 8/20  Iteration 1374/3560 Training loss: 1.5201 5.9964 sec/batch
Epoch 8/20  Iteration 1375/3560 Training loss: 1.5198 5.9558 sec/batch
Epoch 8/20  Iteration 1376/3560 Training loss: 1.5195 5.9762 sec/batch
Epoch 8/20  Iteration 1377/3560 Training loss: 1.5190 5.9708 sec/batch
Epoch 8/20  Iteration 1378/3560 Training loss: 1.5187 5.9521 sec/batch
Epoch 8/20  Iteration 1379/3560 Training loss: 1.5187 6.2590 sec/batch
Epoch 8/20  Iteration 1380/3560 Training loss: 1.5187 6.0128 sec/batch
Epoch 8/20  Iteration 1381/3560 Training loss: 1.5185 5.9793 sec/batch
Epoch 8/20  Iteration 1382/3560 Training loss: 1.5185 5.9576 sec/batch
Epoch 8/20  Iteration 1383/3560 Training loss: 1.5186 5.9664 sec/batch
Epoch 8/20  Iteration 1384/3560 Training loss: 1.5186 5.9520 sec/batch
Epoch 8/20  Iteration 1385/3560 Training loss: 1.5186 5.9783 sec/batch
Epoch 8/20  Iteration 1386/3560 Training loss: 1.5184 5.9756 sec/batch
Epoch 8/20  Iteration 1387/3560 Training loss: 1.5186 5.9956 sec/batch
Epoch 8/20  Iteration 1388/3560 Training loss: 1.5185 5.9908 sec/batch
Epoch 8/20  Iteration 1389/3560 Training loss: 1.5184 5.9669 sec/batch
Epoch 8/20  Iteration 1390/3560 Training loss: 1.5184 5.9258 sec/batch
Epoch 8/20  Iteration 1391/3560 Training loss: 1.5182 5.9799 sec/batch
Epoch 8/20  Iteration 1392/3560 Training loss: 1.5181 5.9962 sec/batch
Epoch 8/20  Iteration 1393/3560 Training loss: 1.5181 5.9580 sec/batch
Epoch 8/20  Iteration 1394/3560 Training loss: 1.5181 5.9638 sec/batch
Epoch 8/20  Iteration 1395/3560 Training loss: 1.5181 5.9466 sec/batch
Epoch 8/20  Iteration 1396/3560 Training loss: 1.5179 5.9776 sec/batch
Epoch 8/20  Iteration 1397/3560 Training loss: 1.5176 6.0128 sec/batch
Epoch 8/20  Iteration 1398/3560 Training loss: 1.5175 5.9490 sec/batch
Epoch 8/20  Iteration 1399/3560 Training loss: 1.5175 6.2811 sec/batch
Epoch 8/20  Iteration 1400/3560 Training loss: 1.5174 5.9909 sec/batch
Epoch 8/20  Iteration 1401/3560 Training loss: 1.5173 6.0311 sec/batch
Epoch 8/20  Iteration 1402/3560 Training loss: 1.5172 5.9863 sec/batch
Epoch 8/20  Iteration 1403/3560 Training loss: 1.5171 6.0288 sec/batch
Epoch 8/20  Iteration 1404/3560 Training loss: 1.5170 5.9737 sec/batch
Epoch 8/20  Iteration 1405/3560 Training loss: 1.5167 5.9459 sec/batch
Epoch 8/20  Iteration 1406/3560 Training loss: 1.5167 6.0006 sec/batch
Epoch 8/20  Iteration 1407/3560 Training loss: 1.5169 5.9655 sec/batch
Epoch 8/20  Iteration 1408/3560 Training loss: 1.5168 5.9919 sec/batch
Epoch 8/20  Iteration 1409/3560 Training loss: 1.5166 6.0018 sec/batch
Epoch 8/20  Iteration 1410/3560 Training loss: 1.5166 6.0485 sec/batch
Epoch 8/20  Iteration 1411/3560 Training loss: 1.5165 6.0319 sec/batch
Epoch 8/20  Iteration 1412/3560 Training loss: 1.5164 6.2700 sec/batch
Epoch 8/20  Iteration 1413/3560 Training loss: 1.5164 5.9581 sec/batch
Epoch 8/20  Iteration 1414/3560 Training loss: 1.5168 5.9898 sec/batch
Epoch 8/20  Iteration 1415/3560 Training loss: 1.5167 5.9432 sec/batch
Epoch 8/20  Iteration 1416/3560 Training loss: 1.5165 5.9927 sec/batch
Epoch 8/20  Iteration 1417/3560 Training loss: 1.5164 6.0149 sec/batch
Epoch 8/20  Iteration 1418/3560 Training loss: 1.5160 5.9535 sec/batch
Epoch 8/20  Iteration 1419/3560 Training loss: 1.5160 6.2679 sec/batch
Epoch 8/20  Iteration 1420/3560 Training loss: 1.5159 5.9551 sec/batch
Epoch 8/20  Iteration 1421/3560 Training loss: 1.5160 5.9873 sec/batch
Epoch 8/20  Iteration 1422/3560 Training loss: 1.5157 5.9553 sec/batch
Epoch 8/20  Iteration 1423/3560 Training loss: 1.5154 5.9830 sec/batch
Epoch 8/20  Iteration 1424/3560 Training loss: 1.5154 5.9497 sec/batch
Epoch 9/20  Iteration 1425/3560 Training loss: 1.5908 5.9906 sec/batch
Epoch 9/20  Iteration 1426/3560 Training loss: 1.5462 5.9732 sec/batch
Epoch 9/20  Iteration 1427/3560 Training loss: 1.5289 5.9867 sec/batch
Epoch 9/20  Iteration 1428/3560 Training loss: 1.5235 5.9826 sec/batch
Epoch 9/20  Iteration 1429/3560 Training loss: 1.5133 6.0876 sec/batch
Epoch 9/20  Iteration 1430/3560 Training loss: 1.5031 5.9408 sec/batch
Epoch 9/20  Iteration 1431/3560 Training loss: 1.5029 5.9577 sec/batch
Epoch 9/20  Iteration 1432/3560 Training loss: 1.5005 5.9606 sec/batch
Epoch 9/20  Iteration 1433/3560 Training loss: 1.5009 5.9744 sec/batch
Epoch 9/20  Iteration 1434/3560 Training loss: 1.4995 5.9386 sec/batch
Epoch 9/20  Iteration 1435/3560 Training loss: 1.4953 5.9727 sec/batch
Epoch 9/20  Iteration 1436/3560 Training loss: 1.4938 6.0314 sec/batch
Epoch 9/20  Iteration 1437/3560 Training loss: 1.4927 5.9857 sec/batch
Epoch 9/20  Iteration 1438/3560 Training loss: 1.4948 5.9656 sec/batch
Epoch 9/20  Iteration 1439/3560 Training loss: 1.4933 6.2281 sec/batch
Epoch 9/20  Iteration 1440/3560 Training loss: 1.4918 5.9269 sec/batch
Epoch 9/20  Iteration 1441/3560 Training loss: 1.4923 5.9982 sec/batch
Epoch 9/20  Iteration 1442/3560 Training loss: 1.4935 5.9663 sec/batch
Epoch 9/20  Iteration 1443/3560 Training loss: 1.4933 6.0556 sec/batch
Epoch 9/20  Iteration 1444/3560 Training loss: 1.4942 5.9912 sec/batch
Epoch 9/20  Iteration 1445/3560 Training loss: 1.4931 5.9190 sec/batch
Epoch 9/20  Iteration 1446/3560 Training loss: 1.4939 5.9977 sec/batch
Epoch 9/20  Iteration 1447/3560 Training loss: 1.4932 5.9470 sec/batch
Epoch 9/20  Iteration 1448/3560 Training loss: 1.4932 5.9310 sec/batch
Epoch 9/20  Iteration 1449/3560 Training loss: 1.4929 5.9389 sec/batch
Epoch 9/20  Iteration 1450/3560 Training loss: 1.4912 5.9545 sec/batch
Epoch 9/20  Iteration 1451/3560 Training loss: 1.4897 5.9867 sec/batch
Epoch 9/20  Iteration 1452/3560 Training loss: 1.4902 5.9241 sec/batch
Epoch 9/20  Iteration 1453/3560 Training loss: 1.4906 5.9909 sec/batch
Epoch 9/20  Iteration 1454/3560 Training loss: 1.4908 5.9862 sec/batch
Epoch 9/20  Iteration 1455/3560 Training loss: 1.4906 5.9705 sec/batch
Epoch 9/20  Iteration 1456/3560 Training loss: 1.4894 5.9823 sec/batch
Epoch 9/20  Iteration 1457/3560 Training loss: 1.4896 6.0197 sec/batch
Epoch 9/20  Iteration 1458/3560 Training loss: 1.4898 5.9562 sec/batch
Epoch 9/20  Iteration 1459/3560 Training loss: 1.4892 6.2277 sec/batch
Epoch 9/20  Iteration 1460/3560 Training loss: 1.4888 6.0019 sec/batch
Epoch 9/20  Iteration 1461/3560 Training loss: 1.4879 6.0273 sec/batch
Epoch 9/20  Iteration 1462/3560 Training loss: 1.4867 6.3272 sec/batch
Epoch 9/20  Iteration 1463/3560 Training loss: 1.4852 6.0184 sec/batch
Epoch 9/20  Iteration 1464/3560 Training loss: 1.4845 6.0029 sec/batch
Epoch 9/20  Iteration 1465/3560 Training loss: 1.4837 5.9450 sec/batch
Epoch 9/20  Iteration 1466/3560 Training loss: 1.4842 5.9718 sec/batch
Epoch 9/20  Iteration 1467/3560 Training loss: 1.4833 6.0121 sec/batch
Epoch 9/20  Iteration 1468/3560 Training loss: 1.4826 5.9646 sec/batch
Epoch 9/20  Iteration 1469/3560 Training loss: 1.4826 6.0014 sec/batch
Epoch 9/20  Iteration 1470/3560 Training loss: 1.4818 5.9894 sec/batch
Epoch 9/20  Iteration 1471/3560 Training loss: 1.4814 6.0338 sec/batch
Epoch 9/20  Iteration 1472/3560 Training loss: 1.4811 5.9674 sec/batch
Epoch 9/20  Iteration 1473/3560 Training loss: 1.4808 6.0516 sec/batch
Epoch 9/20  Iteration 1474/3560 Training loss: 1.4812 5.9635 sec/batch
Epoch 9/20  Iteration 1475/3560 Training loss: 1.4807 6.0224 sec/batch
Epoch 9/20  Iteration 1476/3560 Training loss: 1.4815 5.9471 sec/batch
Epoch 9/20  Iteration 1477/3560 Training loss: 1.4818 5.9698 sec/batch
Epoch 9/20  Iteration 1478/3560 Training loss: 1.4818 5.9759 sec/batch
Epoch 9/20  Iteration 1479/3560 Training loss: 1.4815 6.2193 sec/batch
Epoch 9/20  Iteration 1480/3560 Training loss: 1.4816 5.9915 sec/batch
Epoch 9/20  Iteration 1481/3560 Training loss: 1.4819 6.0164 sec/batch
Epoch 9/20  Iteration 1482/3560 Training loss: 1.4813 5.9542 sec/batch
Epoch 9/20  Iteration 1483/3560 Training loss: 1.4807 5.9908 sec/batch
Epoch 9/20  Iteration 1484/3560 Training loss: 1.4812 5.9908 sec/batch
Epoch 9/20  Iteration 1485/3560 Training loss: 1.4810 5.9718 sec/batch
Epoch 9/20  Iteration 1486/3560 Training loss: 1.4819 6.0117 sec/batch
Epoch 9/20  Iteration 1487/3560 Training loss: 1.4821 6.0179 sec/batch
Epoch 9/20  Iteration 1488/3560 Training loss: 1.4822 5.9875 sec/batch
Epoch 9/20  Iteration 1489/3560 Training loss: 1.4820 5.9857 sec/batch
Epoch 9/20  Iteration 1490/3560 Training loss: 1.4821 5.9799 sec/batch
Epoch 9/20  Iteration 1491/3560 Training loss: 1.4823 6.0171 sec/batch
Epoch 9/20  Iteration 1492/3560 Training loss: 1.4818 5.9797 sec/batch
Epoch 9/20  Iteration 1493/3560 Training loss: 1.4817 5.9457 sec/batch
Epoch 9/20  Iteration 1494/3560 Training loss: 1.4816 5.9652 sec/batch
Epoch 9/20  Iteration 1495/3560 Training loss: 1.4820 5.9961 sec/batch
Epoch 9/20  Iteration 1496/3560 Training loss: 1.4822 5.9817 sec/batch
Epoch 9/20  Iteration 1497/3560 Training loss: 1.4826 5.9692 sec/batch
Epoch 9/20  Iteration 1498/3560 Training loss: 1.4821 5.9493 sec/batch
Epoch 9/20  Iteration 1499/3560 Training loss: 1.4819 6.2321 sec/batch
Epoch 9/20  Iteration 1500/3560 Training loss: 1.4820 5.9645 sec/batch
Epoch 9/20  Iteration 1501/3560 Training loss: 1.4817 6.0046 sec/batch
Epoch 9/20  Iteration 1502/3560 Training loss: 1.4816 5.9827 sec/batch
Epoch 9/20  Iteration 1503/3560 Training loss: 1.4810 5.9999 sec/batch
Epoch 9/20  Iteration 1504/3560 Training loss: 1.4809 5.9592 sec/batch
Epoch 9/20  Iteration 1505/3560 Training loss: 1.4802 5.9602 sec/batch
Epoch 9/20  Iteration 1506/3560 Training loss: 1.4801 5.9331 sec/batch
Epoch 9/20  Iteration 1507/3560 Training loss: 1.4795 6.1833 sec/batch
Epoch 9/20  Iteration 1508/3560 Training loss: 1.4794 5.9589 sec/batch
Epoch 9/20  Iteration 1509/3560 Training loss: 1.4791 6.0205 sec/batch
Epoch 9/20  Iteration 1510/3560 Training loss: 1.4788 5.9660 sec/batch
Epoch 9/20  Iteration 1511/3560 Training loss: 1.4783 6.0149 sec/batch
Epoch 9/20  Iteration 1512/3560 Training loss: 1.4780 6.3217 sec/batch
Epoch 9/20  Iteration 1513/3560 Training loss: 1.4774 5.9638 sec/batch
Epoch 9/20  Iteration 1514/3560 Training loss: 1.4772 5.9813 sec/batch
Epoch 9/20  Iteration 1515/3560 Training loss: 1.4769 6.0503 sec/batch
Epoch 9/20  Iteration 1516/3560 Training loss: 1.4766 5.9859 sec/batch
Epoch 9/20  Iteration 1517/3560 Training loss: 1.4761 6.0143 sec/batch
Epoch 9/20  Iteration 1518/3560 Training loss: 1.4756 6.1799 sec/batch
Epoch 9/20  Iteration 1519/3560 Training loss: 1.4752 6.0585 sec/batch
Epoch 9/20  Iteration 1520/3560 Training loss: 1.4752 6.0851 sec/batch
Epoch 9/20  Iteration 1521/3560 Training loss: 1.4750 5.9893 sec/batch
Epoch 9/20  Iteration 1522/3560 Training loss: 1.4745 5.9566 sec/batch
Epoch 9/20  Iteration 1523/3560 Training loss: 1.4741 5.9847 sec/batch
Epoch 9/20  Iteration 1524/3560 Training loss: 1.4736 5.9575 sec/batch
Epoch 9/20  Iteration 1525/3560 Training loss: 1.4735 5.9667 sec/batch
Epoch 9/20  Iteration 1526/3560 Training loss: 1.4733 5.9791 sec/batch
Epoch 9/20  Iteration 1527/3560 Training loss: 1.4732 5.9538 sec/batch
Epoch 9/20  Iteration 1528/3560 Training loss: 1.4728 5.9812 sec/batch
Epoch 9/20  Iteration 1529/3560 Training loss: 1.4724 5.9910 sec/batch
Epoch 9/20  Iteration 1530/3560 Training loss: 1.4723 5.9953 sec/batch
Epoch 9/20  Iteration 1531/3560 Training loss: 1.4720 5.9313 sec/batch
Epoch 9/20  Iteration 1532/3560 Training loss: 1.4718 5.9773 sec/batch
Epoch 9/20  Iteration 1533/3560 Training loss: 1.4716 5.9838 sec/batch
Epoch 9/20  Iteration 1534/3560 Training loss: 1.4715 5.9759 sec/batch
Epoch 9/20  Iteration 1535/3560 Training loss: 1.4712 5.9873 sec/batch
Epoch 9/20  Iteration 1536/3560 Training loss: 1.4710 5.9818 sec/batch
Epoch 9/20  Iteration 1537/3560 Training loss: 1.4708 5.9747 sec/batch
Epoch 9/20  Iteration 1538/3560 Training loss: 1.4706 6.0659 sec/batch
Epoch 9/20  Iteration 1539/3560 Training loss: 1.4702 6.1961 sec/batch
Epoch 9/20  Iteration 1540/3560 Training loss: 1.4698 5.9597 sec/batch
Epoch 9/20  Iteration 1541/3560 Training loss: 1.4698 6.0042 sec/batch
Epoch 9/20  Iteration 1542/3560 Training loss: 1.4698 6.0043 sec/batch
Epoch 9/20  Iteration 1543/3560 Training loss: 1.4696 5.9754 sec/batch
Epoch 9/20  Iteration 1544/3560 Training loss: 1.4695 6.0203 sec/batch
Epoch 9/20  Iteration 1545/3560 Training loss: 1.4694 6.0083 sec/batch
Epoch 9/20  Iteration 1546/3560 Training loss: 1.4690 5.9618 sec/batch
Epoch 9/20  Iteration 1547/3560 Training loss: 1.4685 6.0002 sec/batch
Epoch 9/20  Iteration 1548/3560 Training loss: 1.4685 5.9981 sec/batch
Epoch 9/20  Iteration 1549/3560 Training loss: 1.4683 5.9926 sec/batch
Epoch 9/20  Iteration 1550/3560 Training loss: 1.4678 6.0066 sec/batch
Epoch 9/20  Iteration 1551/3560 Training loss: 1.4678 5.9745 sec/batch
Epoch 9/20  Iteration 1552/3560 Training loss: 1.4678 5.9476 sec/batch
Epoch 9/20  Iteration 1553/3560 Training loss: 1.4676 5.9810 sec/batch
Epoch 9/20  Iteration 1554/3560 Training loss: 1.4672 5.9662 sec/batch
Epoch 9/20  Iteration 1555/3560 Training loss: 1.4667 5.9715 sec/batch
Epoch 9/20  Iteration 1556/3560 Training loss: 1.4665 5.9888 sec/batch
Epoch 9/20  Iteration 1557/3560 Training loss: 1.4664 6.0074 sec/batch
Epoch 9/20  Iteration 1558/3560 Training loss: 1.4663 6.1196 sec/batch
Epoch 9/20  Iteration 1559/3560 Training loss: 1.4663 6.0619 sec/batch
Epoch 9/20  Iteration 1560/3560 Training loss: 1.4662 5.9982 sec/batch
Epoch 9/20  Iteration 1561/3560 Training loss: 1.4663 6.0303 sec/batch
Epoch 9/20  Iteration 1562/3560 Training loss: 1.4662 6.3466 sec/batch
Epoch 9/20  Iteration 1563/3560 Training loss: 1.4661 5.9496 sec/batch
Epoch 9/20  Iteration 1564/3560 Training loss: 1.4660 5.9912 sec/batch
Epoch 9/20  Iteration 1565/3560 Training loss: 1.4663 5.9557 sec/batch
Epoch 9/20  Iteration 1566/3560 Training loss: 1.4662 6.0089 sec/batch
Epoch 9/20  Iteration 1567/3560 Training loss: 1.4661 5.9827 sec/batch
Epoch 9/20  Iteration 1568/3560 Training loss: 1.4661 5.9881 sec/batch
Epoch 9/20  Iteration 1569/3560 Training loss: 1.4660 6.0054 sec/batch
Epoch 9/20  Iteration 1570/3560 Training loss: 1.4660 5.9645 sec/batch
Epoch 9/20  Iteration 1571/3560 Training loss: 1.4659 5.9774 sec/batch
Epoch 9/20  Iteration 1572/3560 Training loss: 1.4661 5.9699 sec/batch
Epoch 9/20  Iteration 1573/3560 Training loss: 1.4660 5.9908 sec/batch
Epoch 9/20  Iteration 1574/3560 Training loss: 1.4658 5.9339 sec/batch
Epoch 9/20  Iteration 1575/3560 Training loss: 1.4653 5.9813 sec/batch
Epoch 9/20  Iteration 1576/3560 Training loss: 1.4652 5.9819 sec/batch
Epoch 9/20  Iteration 1577/3560 Training loss: 1.4652 5.9945 sec/batch
Epoch 9/20  Iteration 1578/3560 Training loss: 1.4651 6.1851 sec/batch
Epoch 9/20  Iteration 1579/3560 Training loss: 1.4650 6.0873 sec/batch
Epoch 9/20  Iteration 1580/3560 Training loss: 1.4649 5.9421 sec/batch
Epoch 9/20  Iteration 1581/3560 Training loss: 1.4648 6.0263 sec/batch
Epoch 9/20  Iteration 1582/3560 Training loss: 1.4647 6.0245 sec/batch
Epoch 9/20  Iteration 1583/3560 Training loss: 1.4643 5.9724 sec/batch
Epoch 9/20  Iteration 1584/3560 Training loss: 1.4643 5.9616 sec/batch
Epoch 9/20  Iteration 1585/3560 Training loss: 1.4644 6.0137 sec/batch
Epoch 9/20  Iteration 1586/3560 Training loss: 1.4642 5.9759 sec/batch
Epoch 9/20  Iteration 1587/3560 Training loss: 1.4641 5.9751 sec/batch
Epoch 9/20  Iteration 1588/3560 Training loss: 1.4640 5.9753 sec/batch
Epoch 9/20  Iteration 1589/3560 Training loss: 1.4638 5.9693 sec/batch
Epoch 9/20  Iteration 1590/3560 Training loss: 1.4637 5.9413 sec/batch
Epoch 9/20  Iteration 1591/3560 Training loss: 1.4637 5.9930 sec/batch
Epoch 9/20  Iteration 1592/3560 Training loss: 1.4640 5.9846 sec/batch
Epoch 9/20  Iteration 1593/3560 Training loss: 1.4639 5.9929 sec/batch
Epoch 9/20  Iteration 1594/3560 Training loss: 1.4638 6.0038 sec/batch
Epoch 9/20  Iteration 1595/3560 Training loss: 1.4637 5.9637 sec/batch
Epoch 9/20  Iteration 1596/3560 Training loss: 1.4634 5.9755 sec/batch
Epoch 9/20  Iteration 1597/3560 Training loss: 1.4634 6.0138 sec/batch
Epoch 9/20  Iteration 1598/3560 Training loss: 1.4633 6.1523 sec/batch
Epoch 9/20  Iteration 1599/3560 Training loss: 1.4633 6.0438 sec/batch
Epoch 9/20  Iteration 1600/3560 Training loss: 1.4630 6.0141 sec/batch
Epoch 9/20  Iteration 1601/3560 Training loss: 1.4628 5.9567 sec/batch
Epoch 9/20  Iteration 1602/3560 Training loss: 1.4628 6.3437 sec/batch
Epoch 10/20  Iteration 1603/3560 Training loss: 1.5406 6.2805 sec/batch
Epoch 10/20  Iteration 1604/3560 Training loss: 1.4962 5.9726 sec/batch
Epoch 10/20  Iteration 1605/3560 Training loss: 1.4785 6.0220 sec/batch
Epoch 10/20  Iteration 1606/3560 Training loss: 1.4736 5.9766 sec/batch
Epoch 10/20  Iteration 1607/3560 Training loss: 1.4654 5.9592 sec/batch
Epoch 10/20  Iteration 1608/3560 Training loss: 1.4539 6.3070 sec/batch
Epoch 10/20  Iteration 1609/3560 Training loss: 1.4517 6.0754 sec/batch
Epoch 10/20  Iteration 1610/3560 Training loss: 1.4486 5.9645 sec/batch
Epoch 10/20  Iteration 1611/3560 Training loss: 1.4484 6.2393 sec/batch
Epoch 10/20  Iteration 1612/3560 Training loss: 1.4472 5.9105 sec/batch
Epoch 10/20  Iteration 1613/3560 Training loss: 1.4438 5.9831 sec/batch
Epoch 10/20  Iteration 1614/3560 Training loss: 1.4424 5.9744 sec/batch
Epoch 10/20  Iteration 1615/3560 Training loss: 1.4417 5.9634 sec/batch
Epoch 10/20  Iteration 1616/3560 Training loss: 1.4429 5.9714 sec/batch
Epoch 10/20  Iteration 1617/3560 Training loss: 1.4415 6.0156 sec/batch
Epoch 10/20  Iteration 1618/3560 Training loss: 1.4397 6.2977 sec/batch
Epoch 10/20  Iteration 1619/3560 Training loss: 1.4392 5.9704 sec/batch
Epoch 10/20  Iteration 1620/3560 Training loss: 1.4405 6.0075 sec/batch
Epoch 10/20  Iteration 1621/3560 Training loss: 1.4401 6.0953 sec/batch
Epoch 10/20  Iteration 1622/3560 Training loss: 1.4412 5.9182 sec/batch
Epoch 10/20  Iteration 1623/3560 Training loss: 1.4398 6.0167 sec/batch
Epoch 10/20  Iteration 1624/3560 Training loss: 1.4398 5.9616 sec/batch
Epoch 10/20  Iteration 1625/3560 Training loss: 1.4386 5.9739 sec/batch
Epoch 10/20  Iteration 1626/3560 Training loss: 1.4381 6.1605 sec/batch
Epoch 10/20  Iteration 1627/3560 Training loss: 1.4377 5.8969 sec/batch
Epoch 10/20  Iteration 1628/3560 Training loss: 1.4364 5.9738 sec/batch
Epoch 10/20  Iteration 1629/3560 Training loss: 1.4352 5.9944 sec/batch
Epoch 10/20  Iteration 1630/3560 Training loss: 1.4356 5.9690 sec/batch
Epoch 10/20  Iteration 1631/3560 Training loss: 1.4357 5.9741 sec/batch
Epoch 10/20  Iteration 1632/3560 Training loss: 1.4354 5.9641 sec/batch
Epoch 10/20  Iteration 1633/3560 Training loss: 1.4350 5.9541 sec/batch
Epoch 10/20  Iteration 1634/3560 Training loss: 1.4338 5.9845 sec/batch
Epoch 10/20  Iteration 1635/3560 Training loss: 1.4340 5.9910 sec/batch
Epoch 10/20  Iteration 1636/3560 Training loss: 1.4343 5.9787 sec/batch
Epoch 10/20  Iteration 1637/3560 Training loss: 1.4338 6.1466 sec/batch
Epoch 10/20  Iteration 1638/3560 Training loss: 1.4335 6.2449 sec/batch
Epoch 10/20  Iteration 1639/3560 Training loss: 1.4328 6.0105 sec/batch
Epoch 10/20  Iteration 1640/3560 Training loss: 1.4316 5.9815 sec/batch
Epoch 10/20  Iteration 1641/3560 Training loss: 1.4301 6.0067 sec/batch
Epoch 10/20  Iteration 1642/3560 Training loss: 1.4296 5.9737 sec/batch
Epoch 10/20  Iteration 1643/3560 Training loss: 1.4288 5.9643 sec/batch
Epoch 10/20  Iteration 1644/3560 Training loss: 1.4293 6.0031 sec/batch
Epoch 10/20  Iteration 1645/3560 Training loss: 1.4287 5.9807 sec/batch
Epoch 10/20  Iteration 1646/3560 Training loss: 1.4279 5.9826 sec/batch
Epoch 10/20  Iteration 1647/3560 Training loss: 1.4279 5.9782 sec/batch
Epoch 10/20  Iteration 1648/3560 Training loss: 1.4268 5.9797 sec/batch
Epoch 10/20  Iteration 1649/3560 Training loss: 1.4265 5.9497 sec/batch
Epoch 10/20  Iteration 1650/3560 Training loss: 1.4259 5.9981 sec/batch
Epoch 10/20  Iteration 1651/3560 Training loss: 1.4257 5.9984 sec/batch
Epoch 10/20  Iteration 1652/3560 Training loss: 1.4258 5.9864 sec/batch
Epoch 10/20  Iteration 1653/3560 Training loss: 1.4251 5.9788 sec/batch
Epoch 10/20  Iteration 1654/3560 Training loss: 1.4257 5.9759 sec/batch
Epoch 10/20  Iteration 1655/3560 Training loss: 1.4256 6.0027 sec/batch
Epoch 10/20  Iteration 1656/3560 Training loss: 1.4258 5.9684 sec/batch
Epoch 10/20  Iteration 1657/3560 Training loss: 1.4257 5.9856 sec/batch
Epoch 10/20  Iteration 1658/3560 Training loss: 1.4256 6.2258 sec/batch
Epoch 10/20  Iteration 1659/3560 Training loss: 1.4259 5.9508 sec/batch
Epoch 10/20  Iteration 1660/3560 Training loss: 1.4254 5.9696 sec/batch
Epoch 10/20  Iteration 1661/3560 Training loss: 1.4247 6.3681 sec/batch
Epoch 10/20  Iteration 1662/3560 Training loss: 1.4251 6.0928 sec/batch
Epoch 10/20  Iteration 1663/3560 Training loss: 1.4249 5.9777 sec/batch
Epoch 10/20  Iteration 1664/3560 Training loss: 1.4257 5.9958 sec/batch
Epoch 10/20  Iteration 1665/3560 Training loss: 1.4261 5.9867 sec/batch
Epoch 10/20  Iteration 1666/3560 Training loss: 1.4262 5.9619 sec/batch
Epoch 10/20  Iteration 1667/3560 Training loss: 1.4261 5.9896 sec/batch
Epoch 10/20  Iteration 1668/3560 Training loss: 1.4263 5.9559 sec/batch
Epoch 10/20  Iteration 1669/3560 Training loss: 1.4265 6.0026 sec/batch
Epoch 10/20  Iteration 1670/3560 Training loss: 1.4261 6.0497 sec/batch
Epoch 10/20  Iteration 1671/3560 Training loss: 1.4261 5.9393 sec/batch
Epoch 10/20  Iteration 1672/3560 Training loss: 1.4259 5.9552 sec/batch
Epoch 10/20  Iteration 1673/3560 Training loss: 1.4263 5.9625 sec/batch
Epoch 10/20  Iteration 1674/3560 Training loss: 1.4264 5.9494 sec/batch
Epoch 10/20  Iteration 1675/3560 Training loss: 1.4267 5.9836 sec/batch
Epoch 10/20  Iteration 1676/3560 Training loss: 1.4262 5.9880 sec/batch
Epoch 10/20  Iteration 1677/3560 Training loss: 1.4259 5.9589 sec/batch
Epoch 10/20  Iteration 1678/3560 Training loss: 1.4261 6.2452 sec/batch
Epoch 10/20  Iteration 1679/3560 Training loss: 1.4259 5.9481 sec/batch
Epoch 10/20  Iteration 1680/3560 Training loss: 1.4258 5.9639 sec/batch
Epoch 10/20  Iteration 1681/3560 Training loss: 1.4251 5.9575 sec/batch
Epoch 10/20  Iteration 1682/3560 Training loss: 1.4249 5.9518 sec/batch
Epoch 10/20  Iteration 1683/3560 Training loss: 1.4243 6.0090 sec/batch
Epoch 10/20  Iteration 1684/3560 Training loss: 1.4243 5.9922 sec/batch
Epoch 10/20  Iteration 1685/3560 Training loss: 1.4238 6.0665 sec/batch
Epoch 10/20  Iteration 1686/3560 Training loss: 1.4237 6.0414 sec/batch
Epoch 10/20  Iteration 1687/3560 Training loss: 1.4234 5.9962 sec/batch
Epoch 10/20  Iteration 1688/3560 Training loss: 1.4231 5.9653 sec/batch
Epoch 10/20  Iteration 1689/3560 Training loss: 1.4226 6.0076 sec/batch
Epoch 10/20  Iteration 1690/3560 Training loss: 1.4222 5.9908 sec/batch
Epoch 10/20  Iteration 1691/3560 Training loss: 1.4218 5.9563 sec/batch
Epoch 10/20  Iteration 1692/3560 Training loss: 1.4217 5.9856 sec/batch
Epoch 10/20  Iteration 1693/3560 Training loss: 1.4214 5.9362 sec/batch
Epoch 10/20  Iteration 1694/3560 Training loss: 1.4212 6.0059 sec/batch
Epoch 10/20  Iteration 1695/3560 Training loss: 1.4207 5.9829 sec/batch
Epoch 10/20  Iteration 1696/3560 Training loss: 1.4203 5.9385 sec/batch
Epoch 10/20  Iteration 1697/3560 Training loss: 1.4200 5.9608 sec/batch
Epoch 10/20  Iteration 1698/3560 Training loss: 1.4200 6.2342 sec/batch
Epoch 10/20  Iteration 1699/3560 Training loss: 1.4198 5.9620 sec/batch
Epoch 10/20  Iteration 1700/3560 Training loss: 1.4193 6.0120 sec/batch
Epoch 10/20  Iteration 1701/3560 Training loss: 1.4189 6.0050 sec/batch
Epoch 10/20  Iteration 1702/3560 Training loss: 1.4184 5.9747 sec/batch
Epoch 10/20  Iteration 1703/3560 Training loss: 1.4184 5.9682 sec/batch
Epoch 10/20  Iteration 1704/3560 Training loss: 1.4181 5.9764 sec/batch
Epoch 10/20  Iteration 1705/3560 Training loss: 1.4180 6.0367 sec/batch
Epoch 10/20  Iteration 1706/3560 Training loss: 1.4177 5.9999 sec/batch
Epoch 10/20  Iteration 1707/3560 Training loss: 1.4174 5.9891 sec/batch
Epoch 10/20  Iteration 1708/3560 Training loss: 1.4172 5.9967 sec/batch
Epoch 10/20  Iteration 1709/3560 Training loss: 1.4170 5.9363 sec/batch
Epoch 10/20  Iteration 1710/3560 Training loss: 1.4168 5.9694 sec/batch
Epoch 10/20  Iteration 1711/3560 Training loss: 1.4166 6.4895 sec/batch
Epoch 10/20  Iteration 1712/3560 Training loss: 1.4165 6.0305 sec/batch
Epoch 10/20  Iteration 1713/3560 Training loss: 1.4162 6.1444 sec/batch
Epoch 10/20  Iteration 1714/3560 Training loss: 1.4162 6.0403 sec/batch
Epoch 10/20  Iteration 1715/3560 Training loss: 1.4159 5.9850 sec/batch
Epoch 10/20  Iteration 1716/3560 Training loss: 1.4157 5.9523 sec/batch
Epoch 10/20  Iteration 1717/3560 Training loss: 1.4154 6.0079 sec/batch
Epoch 10/20  Iteration 1718/3560 Training loss: 1.4150 6.3640 sec/batch
Epoch 10/20  Iteration 1719/3560 Training loss: 1.4148 6.0022 sec/batch
Epoch 10/20  Iteration 1720/3560 Training loss: 1.4148 6.0280 sec/batch
Epoch 10/20  Iteration 1721/3560 Training loss: 1.4145 5.9854 sec/batch
Epoch 10/20  Iteration 1722/3560 Training loss: 1.4144 5.9522 sec/batch
Epoch 10/20  Iteration 1723/3560 Training loss: 1.4143 6.0117 sec/batch
Epoch 10/20  Iteration 1724/3560 Training loss: 1.4138 5.9643 sec/batch
Epoch 10/20  Iteration 1725/3560 Training loss: 1.4134 6.0145 sec/batch
Epoch 10/20  Iteration 1726/3560 Training loss: 1.4133 5.9728 sec/batch
Epoch 10/20  Iteration 1727/3560 Training loss: 1.4132 6.0051 sec/batch
Epoch 10/20  Iteration 1728/3560 Training loss: 1.4127 5.9405 sec/batch
Epoch 10/20  Iteration 1729/3560 Training loss: 1.4129 6.0108 sec/batch
Epoch 10/20  Iteration 1730/3560 Training loss: 1.4129 5.9716 sec/batch
Epoch 10/20  Iteration 1731/3560 Training loss: 1.4127 5.9930 sec/batch
Epoch 10/20  Iteration 1732/3560 Training loss: 1.4124 5.9828 sec/batch
Epoch 10/20  Iteration 1733/3560 Training loss: 1.4119 5.9770 sec/batch
Epoch 10/20  Iteration 1734/3560 Training loss: 1.4116 5.9531 sec/batch
Epoch 10/20  Iteration 1735/3560 Training loss: 1.4116 5.9665 sec/batch
Epoch 10/20  Iteration 1736/3560 Training loss: 1.4116 5.9461 sec/batch
Epoch 10/20  Iteration 1737/3560 Training loss: 1.4115 5.9556 sec/batch
Epoch 10/20  Iteration 1738/3560 Training loss: 1.4115 6.2582 sec/batch
Epoch 10/20  Iteration 1739/3560 Training loss: 1.4116 6.0006 sec/batch
Epoch 10/20  Iteration 1740/3560 Training loss: 1.4117 6.0081 sec/batch
Epoch 10/20  Iteration 1741/3560 Training loss: 1.4116 5.9561 sec/batch
Epoch 10/20  Iteration 1742/3560 Training loss: 1.4115 5.9635 sec/batch
Epoch 10/20  Iteration 1743/3560 Training loss: 1.4118 6.0003 sec/batch
Epoch 10/20  Iteration 1744/3560 Training loss: 1.4117 5.9608 sec/batch
Epoch 10/20  Iteration 1745/3560 Training loss: 1.4116 5.9836 sec/batch
Epoch 10/20  Iteration 1746/3560 Training loss: 1.4117 5.9475 sec/batch
Epoch 10/20  Iteration 1747/3560 Training loss: 1.4115 5.9798 sec/batch
Epoch 10/20  Iteration 1748/3560 Training loss: 1.4116 5.9783 sec/batch
Epoch 10/20  Iteration 1749/3560 Training loss: 1.4116 5.9917 sec/batch
Epoch 10/20  Iteration 1750/3560 Training loss: 1.4117 6.0041 sec/batch
Epoch 10/20  Iteration 1751/3560 Training loss: 1.4118 5.9736 sec/batch
Epoch 10/20  Iteration 1752/3560 Training loss: 1.4116 5.9642 sec/batch
Epoch 10/20  Iteration 1753/3560 Training loss: 1.4113 5.9911 sec/batch
Epoch 10/20  Iteration 1754/3560 Training loss: 1.4111 5.9710 sec/batch
Epoch 10/20  Iteration 1755/3560 Training loss: 1.4111 5.9719 sec/batch
Epoch 10/20  Iteration 1756/3560 Training loss: 1.4110 5.9660 sec/batch
Epoch 10/20  Iteration 1757/3560 Training loss: 1.4110 5.9992 sec/batch
Epoch 10/20  Iteration 1758/3560 Training loss: 1.4109 6.1946 sec/batch
Epoch 10/20  Iteration 1759/3560 Training loss: 1.4108 5.9931 sec/batch
Epoch 10/20  Iteration 1760/3560 Training loss: 1.4107 6.0371 sec/batch
Epoch 10/20  Iteration 1761/3560 Training loss: 1.4104 6.3508 sec/batch
Epoch 10/20  Iteration 1762/3560 Training loss: 1.4104 6.0095 sec/batch
Epoch 10/20  Iteration 1763/3560 Training loss: 1.4105 5.9683 sec/batch
Epoch 10/20  Iteration 1764/3560 Training loss: 1.4104 6.0181 sec/batch
Epoch 10/20  Iteration 1765/3560 Training loss: 1.4104 5.9788 sec/batch
Epoch 10/20  Iteration 1766/3560 Training loss: 1.4103 5.9670 sec/batch
Epoch 10/20  Iteration 1767/3560 Training loss: 1.4102 5.9881 sec/batch
Epoch 10/20  Iteration 1768/3560 Training loss: 1.4101 5.9888 sec/batch
Epoch 10/20  Iteration 1769/3560 Training loss: 1.4103 6.0293 sec/batch
Epoch 10/20  Iteration 1770/3560 Training loss: 1.4106 5.9729 sec/batch
Epoch 10/20  Iteration 1771/3560 Training loss: 1.4106 5.9723 sec/batch
Epoch 10/20  Iteration 1772/3560 Training loss: 1.4105 5.9683 sec/batch
Epoch 10/20  Iteration 1773/3560 Training loss: 1.4103 6.0145 sec/batch
Epoch 10/20  Iteration 1774/3560 Training loss: 1.4101 5.9534 sec/batch
Epoch 10/20  Iteration 1775/3560 Training loss: 1.4101 6.0322 sec/batch
Epoch 10/20  Iteration 1776/3560 Training loss: 1.4101 6.0024 sec/batch
Epoch 10/20  Iteration 1777/3560 Training loss: 1.4101 5.9942 sec/batch
Epoch 10/20  Iteration 1778/3560 Training loss: 1.4099 6.2513 sec/batch
Epoch 10/20  Iteration 1779/3560 Training loss: 1.4097 6.0156 sec/batch
Epoch 10/20  Iteration 1780/3560 Training loss: 1.4098 5.9859 sec/batch
Epoch 11/20  Iteration 1781/3560 Training loss: 1.5011 6.0326 sec/batch
Epoch 11/20  Iteration 1782/3560 Training loss: 1.4521 5.9893 sec/batch
Epoch 11/20  Iteration 1783/3560 Training loss: 1.4338 5.9991 sec/batch
Epoch 11/20  Iteration 1784/3560 Training loss: 1.4312 5.9490 sec/batch
Epoch 11/20  Iteration 1785/3560 Training loss: 1.4211 6.0022 sec/batch
Epoch 11/20  Iteration 1786/3560 Training loss: 1.4119 5.9813 sec/batch
Epoch 11/20  Iteration 1787/3560 Training loss: 1.4105 5.9723 sec/batch
Epoch 11/20  Iteration 1788/3560 Training loss: 1.4073 5.9962 sec/batch
Epoch 11/20  Iteration 1789/3560 Training loss: 1.4070 5.9482 sec/batch
Epoch 11/20  Iteration 1790/3560 Training loss: 1.4057 5.9900 sec/batch
Epoch 11/20  Iteration 1791/3560 Training loss: 1.4017 6.0027 sec/batch
Epoch 11/20  Iteration 1792/3560 Training loss: 1.4002 5.9684 sec/batch
Epoch 11/20  Iteration 1793/3560 Training loss: 1.3990 5.9489 sec/batch
Epoch 11/20  Iteration 1794/3560 Training loss: 1.3999 5.9871 sec/batch
Epoch 11/20  Iteration 1795/3560 Training loss: 1.3980 6.0097 sec/batch
Epoch 11/20  Iteration 1796/3560 Training loss: 1.3954 6.0200 sec/batch
Epoch 11/20  Iteration 1797/3560 Training loss: 1.3954 5.9826 sec/batch
Epoch 11/20  Iteration 1798/3560 Training loss: 1.3964 6.3025 sec/batch
Epoch 11/20  Iteration 1799/3560 Training loss: 1.3956 5.9988 sec/batch
Epoch 11/20  Iteration 1800/3560 Training loss: 1.3969 5.9447 sec/batch
Epoch 11/20  Iteration 1801/3560 Training loss: 1.3965 5.9996 sec/batch
Epoch 11/20  Iteration 1802/3560 Training loss: 1.3971 5.9602 sec/batch
Epoch 11/20  Iteration 1803/3560 Training loss: 1.3960 5.9741 sec/batch
Epoch 11/20  Iteration 1804/3560 Training loss: 1.3960 5.9708 sec/batch
Epoch 11/20  Iteration 1805/3560 Training loss: 1.3956 5.9790 sec/batch
Epoch 11/20  Iteration 1806/3560 Training loss: 1.3936 6.0224 sec/batch
Epoch 11/20  Iteration 1807/3560 Training loss: 1.3921 5.9551 sec/batch
Epoch 11/20  Iteration 1808/3560 Training loss: 1.3923 5.9635 sec/batch
Epoch 11/20  Iteration 1809/3560 Training loss: 1.3925 5.9772 sec/batch
Epoch 11/20  Iteration 1810/3560 Training loss: 1.3923 5.9619 sec/batch
Epoch 11/20  Iteration 1811/3560 Training loss: 1.3920 6.4420 sec/batch
Epoch 11/20  Iteration 1812/3560 Training loss: 1.3907 5.9792 sec/batch
Epoch 11/20  Iteration 1813/3560 Training loss: 1.3909 6.0083 sec/batch
Epoch 11/20  Iteration 1814/3560 Training loss: 1.3913 5.9877 sec/batch
Epoch 11/20  Iteration 1815/3560 Training loss: 1.3910 6.0769 sec/batch
Epoch 11/20  Iteration 1816/3560 Training loss: 1.3907 6.0311 sec/batch
Epoch 11/20  Iteration 1817/3560 Training loss: 1.3900 6.0652 sec/batch
Epoch 11/20  Iteration 1818/3560 Training loss: 1.3886 6.2606 sec/batch
Epoch 11/20  Iteration 1819/3560 Training loss: 1.3872 6.0218 sec/batch
Epoch 11/20  Iteration 1820/3560 Training loss: 1.3868 6.0269 sec/batch
Epoch 11/20  Iteration 1821/3560 Training loss: 1.3863 6.0020 sec/batch
Epoch 11/20  Iteration 1822/3560 Training loss: 1.3869 5.9781 sec/batch
Epoch 11/20  Iteration 1823/3560 Training loss: 1.3866 6.1984 sec/batch
Epoch 11/20  Iteration 1824/3560 Training loss: 1.3859 6.1103 sec/batch
Epoch 11/20  Iteration 1825/3560 Training loss: 1.3862 6.0411 sec/batch
Epoch 11/20  Iteration 1826/3560 Training loss: 1.3851 6.0243 sec/batch
Epoch 11/20  Iteration 1827/3560 Training loss: 1.3845 5.9541 sec/batch
Epoch 11/20  Iteration 1828/3560 Training loss: 1.3842 6.0057 sec/batch
Epoch 11/20  Iteration 1829/3560 Training loss: 1.3839 5.9709 sec/batch
Epoch 11/20  Iteration 1830/3560 Training loss: 1.3840 5.9993 sec/batch
Epoch 11/20  Iteration 1831/3560 Training loss: 1.3835 5.9548 sec/batch
Epoch 11/20  Iteration 1832/3560 Training loss: 1.3842 6.0340 sec/batch
Epoch 11/20  Iteration 1833/3560 Training loss: 1.3842 5.9832 sec/batch
Epoch 11/20  Iteration 1834/3560 Training loss: 1.3844 6.0052 sec/batch
Epoch 11/20  Iteration 1835/3560 Training loss: 1.3842 6.0029 sec/batch
Epoch 11/20  Iteration 1836/3560 Training loss: 1.3841 5.9639 sec/batch
Epoch 11/20  Iteration 1837/3560 Training loss: 1.3844 5.9659 sec/batch
Epoch 11/20  Iteration 1838/3560 Training loss: 1.3840 6.2561 sec/batch
Epoch 11/20  Iteration 1839/3560 Training loss: 1.3832 5.9831 sec/batch
Epoch 11/20  Iteration 1840/3560 Training loss: 1.3838 6.0176 sec/batch
Epoch 11/20  Iteration 1841/3560 Training loss: 1.3837 5.9822 sec/batch
Epoch 11/20  Iteration 1842/3560 Training loss: 1.3845 5.9765 sec/batch
Epoch 11/20  Iteration 1843/3560 Training loss: 1.3848 5.9981 sec/batch
Epoch 11/20  Iteration 1844/3560 Training loss: 1.3849 5.9969 sec/batch
Epoch 11/20  Iteration 1845/3560 Training loss: 1.3847 5.9783 sec/batch
Epoch 11/20  Iteration 1846/3560 Training loss: 1.3849 5.9415 sec/batch
Epoch 11/20  Iteration 1847/3560 Training loss: 1.3852 6.0492 sec/batch
Epoch 11/20  Iteration 1848/3560 Training loss: 1.3848 6.0180 sec/batch
Epoch 11/20  Iteration 1849/3560 Training loss: 1.3848 5.9567 sec/batch
Epoch 11/20  Iteration 1850/3560 Training loss: 1.3845 6.0021 sec/batch
Epoch 11/20  Iteration 1851/3560 Training loss: 1.3849 5.9837 sec/batch
Epoch 11/20  Iteration 1852/3560 Training loss: 1.3851 5.9151 sec/batch
Epoch 11/20  Iteration 1853/3560 Training loss: 1.3854 5.9885 sec/batch
Epoch 11/20  Iteration 1854/3560 Training loss: 1.3850 6.2936 sec/batch
Epoch 11/20  Iteration 1855/3560 Training loss: 1.3848 7.7822 sec/batch
Epoch 11/20  Iteration 1856/3560 Training loss: 1.3849 6.8221 sec/batch
Epoch 11/20  Iteration 1857/3560 Training loss: 1.3847 6.5390 sec/batch
Epoch 11/20  Iteration 1858/3560 Training loss: 1.3846 6.0226 sec/batch
Epoch 11/20  Iteration 1859/3560 Training loss: 1.3840 6.0211 sec/batch
Epoch 11/20  Iteration 1860/3560 Training loss: 1.3840 6.3120 sec/batch
Epoch 11/20  Iteration 1861/3560 Training loss: 1.3835 6.0399 sec/batch
Epoch 11/20  Iteration 1862/3560 Training loss: 1.3834 6.0150 sec/batch
Epoch 11/20  Iteration 1863/3560 Training loss: 1.3829 6.0390 sec/batch
Epoch 11/20  Iteration 1864/3560 Training loss: 1.3829 6.0368 sec/batch
Epoch 11/20  Iteration 1865/3560 Training loss: 1.3825 5.9901 sec/batch
Epoch 11/20  Iteration 1866/3560 Training loss: 1.3823 5.9897 sec/batch
Epoch 11/20  Iteration 1867/3560 Training loss: 1.3820 6.0110 sec/batch
Epoch 11/20  Iteration 1868/3560 Training loss: 1.3816 6.0171 sec/batch
Epoch 11/20  Iteration 1869/3560 Training loss: 1.3812 6.0310 sec/batch
Epoch 11/20  Iteration 1870/3560 Training loss: 1.3812 6.0358 sec/batch
Epoch 11/20  Iteration 1871/3560 Training loss: 1.3808 5.9790 sec/batch
Epoch 11/20  Iteration 1872/3560 Training loss: 1.3807 6.0369 sec/batch
Epoch 11/20  Iteration 1873/3560 Training loss: 1.3802 5.9997 sec/batch
Epoch 11/20  Iteration 1874/3560 Training loss: 1.3799 6.0172 sec/batch
Epoch 11/20  Iteration 1875/3560 Training loss: 1.3795 5.9993 sec/batch
Epoch 11/20  Iteration 1876/3560 Training loss: 1.3795 6.1402 sec/batch
Epoch 11/20  Iteration 1877/3560 Training loss: 1.3794 6.2584 sec/batch
Epoch 11/20  Iteration 1878/3560 Training loss: 1.3788 6.0239 sec/batch
Epoch 11/20  Iteration 1879/3560 Training loss: 1.3784 6.0083 sec/batch
Epoch 11/20  Iteration 1880/3560 Training loss: 1.3779 6.0339 sec/batch
Epoch 11/20  Iteration 1881/3560 Training loss: 1.3778 6.0084 sec/batch
Epoch 11/20  Iteration 1882/3560 Training loss: 1.3776 6.0010 sec/batch
Epoch 11/20  Iteration 1883/3560 Training loss: 1.3774 5.9911 sec/batch
Epoch 11/20  Iteration 1884/3560 Training loss: 1.3773 6.0392 sec/batch
Epoch 11/20  Iteration 1885/3560 Training loss: 1.3771 6.0480 sec/batch
Epoch 11/20  Iteration 1886/3560 Training loss: 1.3769 6.0306 sec/batch
Epoch 11/20  Iteration 1887/3560 Training loss: 1.3767 5.9937 sec/batch
Epoch 11/20  Iteration 1888/3560 Training loss: 1.3767 5.9760 sec/batch
Epoch 11/20  Iteration 1889/3560 Training loss: 1.3766 6.0130 sec/batch
Epoch 11/20  Iteration 1890/3560 Training loss: 1.3766 6.0677 sec/batch
Epoch 11/20  Iteration 1891/3560 Training loss: 1.3764 6.0037 sec/batch
Epoch 11/20  Iteration 1892/3560 Training loss: 1.3763 5.9858 sec/batch
Epoch 11/20  Iteration 1893/3560 Training loss: 1.3761 6.0008 sec/batch
Epoch 11/20  Iteration 1894/3560 Training loss: 1.3759 5.9763 sec/batch
Epoch 11/20  Iteration 1895/3560 Training loss: 1.3755 6.0654 sec/batch
Epoch 11/20  Iteration 1896/3560 Training loss: 1.3751 6.4381 sec/batch
Epoch 11/20  Iteration 1897/3560 Training loss: 1.3751 6.1805 sec/batch
Epoch 11/20  Iteration 1898/3560 Training loss: 1.3750 6.0418 sec/batch
Epoch 11/20  Iteration 1899/3560 Training loss: 1.3749 5.9880 sec/batch
Epoch 11/20  Iteration 1900/3560 Training loss: 1.3748 6.0436 sec/batch
Epoch 11/20  Iteration 1901/3560 Training loss: 1.3747 6.0266 sec/batch
Epoch 11/20  Iteration 1902/3560 Training loss: 1.3743 6.0255 sec/batch
Epoch 11/20  Iteration 1903/3560 Training loss: 1.3739 5.9953 sec/batch
Epoch 11/20  Iteration 1904/3560 Training loss: 1.3739 6.0041 sec/batch
Epoch 11/20  Iteration 1905/3560 Training loss: 1.3738 6.0336 sec/batch
Epoch 11/20  Iteration 1906/3560 Training loss: 1.3734 5.9855 sec/batch
Epoch 11/20  Iteration 1907/3560 Training loss: 1.3734 6.0401 sec/batch
Epoch 11/20  Iteration 1908/3560 Training loss: 1.3734 6.0303 sec/batch
Epoch 11/20  Iteration 1909/3560 Training loss: 1.3732 6.0917 sec/batch
Epoch 11/20  Iteration 1910/3560 Training loss: 1.3729 6.4578 sec/batch
Epoch 11/20  Iteration 1911/3560 Training loss: 1.3724 5.9759 sec/batch
Epoch 11/20  Iteration 1912/3560 Training loss: 1.3722 5.9992 sec/batch
Epoch 11/20  Iteration 1913/3560 Training loss: 1.3723 6.0446 sec/batch
Epoch 11/20  Iteration 1914/3560 Training loss: 1.3722 6.0288 sec/batch
Epoch 11/20  Iteration 1915/3560 Training loss: 1.3722 6.0264 sec/batch
Epoch 11/20  Iteration 1916/3560 Training loss: 1.3722 6.3419 sec/batch
Epoch 11/20  Iteration 1917/3560 Training loss: 1.3723 6.0126 sec/batch
Epoch 11/20  Iteration 1918/3560 Training loss: 1.3724 6.0053 sec/batch
Epoch 11/20  Iteration 1919/3560 Training loss: 1.3724 6.0287 sec/batch
Epoch 11/20  Iteration 1920/3560 Training loss: 1.3723 6.0009 sec/batch
Epoch 11/20  Iteration 1921/3560 Training loss: 1.3726 6.0112 sec/batch
Epoch 11/20  Iteration 1922/3560 Training loss: 1.3726 5.9810 sec/batch
Epoch 11/20  Iteration 1923/3560 Training loss: 1.3724 6.0100 sec/batch
Epoch 11/20  Iteration 1924/3560 Training loss: 1.3726 6.0242 sec/batch
Epoch 11/20  Iteration 1925/3560 Training loss: 1.3724 6.0337 sec/batch
Epoch 11/20  Iteration 1926/3560 Training loss: 1.3725 5.9661 sec/batch
Epoch 11/20  Iteration 1927/3560 Training loss: 1.3725 6.0068 sec/batch
Epoch 11/20  Iteration 1928/3560 Training loss: 1.3727 6.0150 sec/batch
Epoch 11/20  Iteration 1929/3560 Training loss: 1.3728 6.0128 sec/batch
Epoch 11/20  Iteration 1930/3560 Training loss: 1.3726 5.9643 sec/batch
Epoch 11/20  Iteration 1931/3560 Training loss: 1.3723 6.0285 sec/batch
Epoch 11/20  Iteration 1932/3560 Training loss: 1.3721 6.0332 sec/batch
Epoch 11/20  Iteration 1933/3560 Training loss: 1.3721 6.0088 sec/batch
Epoch 11/20  Iteration 1934/3560 Training loss: 1.3721 6.0600 sec/batch
Epoch 11/20  Iteration 1935/3560 Training loss: 1.3720 5.9943 sec/batch
Epoch 11/20  Iteration 1936/3560 Training loss: 1.3719 6.3780 sec/batch
Epoch 11/20  Iteration 1937/3560 Training loss: 1.3720 6.0032 sec/batch
Epoch 11/20  Iteration 1938/3560 Training loss: 1.3719 6.0560 sec/batch
Epoch 11/20  Iteration 1939/3560 Training loss: 1.3716 6.0412 sec/batch
Epoch 11/20  Iteration 1940/3560 Training loss: 1.3717 5.9898 sec/batch
Epoch 11/20  Iteration 1941/3560 Training loss: 1.3719 5.9895 sec/batch
Epoch 11/20  Iteration 1942/3560 Training loss: 1.3718 5.9732 sec/batch
Epoch 11/20  Iteration 1943/3560 Training loss: 1.3717 5.9970 sec/batch
Epoch 11/20  Iteration 1944/3560 Training loss: 1.3716 6.0142 sec/batch
Epoch 11/20  Iteration 1945/3560 Training loss: 1.3716 6.0346 sec/batch
Epoch 11/20  Iteration 1946/3560 Training loss: 1.3715 6.0304 sec/batch
Epoch 11/20  Iteration 1947/3560 Training loss: 1.3717 6.0576 sec/batch
Epoch 11/20  Iteration 1948/3560 Training loss: 1.3720 6.0008 sec/batch
Epoch 11/20  Iteration 1949/3560 Training loss: 1.3719 5.9831 sec/batch
Epoch 11/20  Iteration 1950/3560 Training loss: 1.3719 6.0330 sec/batch
Epoch 11/20  Iteration 1951/3560 Training loss: 1.3718 6.0038 sec/batch
Epoch 11/20  Iteration 1952/3560 Training loss: 1.3716 6.0156 sec/batch
Epoch 11/20  Iteration 1953/3560 Training loss: 1.3717 6.0277 sec/batch
Epoch 11/20  Iteration 1954/3560 Training loss: 1.3717 6.0494 sec/batch
Epoch 11/20  Iteration 1955/3560 Training loss: 1.3717 6.0583 sec/batch
Epoch 11/20  Iteration 1956/3560 Training loss: 1.3714 6.3868 sec/batch
Epoch 11/20  Iteration 1957/3560 Training loss: 1.3713 5.9725 sec/batch
Epoch 11/20  Iteration 1958/3560 Training loss: 1.3714 6.0487 sec/batch
Epoch 12/20  Iteration 1959/3560 Training loss: 1.4805 6.4225 sec/batch
Epoch 12/20  Iteration 1960/3560 Training loss: 1.4311 6.3493 sec/batch
Epoch 12/20  Iteration 1961/3560 Training loss: 1.4106 6.0536 sec/batch
Epoch 12/20  Iteration 1962/3560 Training loss: 1.4057 6.0289 sec/batch
Epoch 12/20  Iteration 1963/3560 Training loss: 1.3930 6.0254 sec/batch
Epoch 12/20  Iteration 1964/3560 Training loss: 1.3805 6.0321 sec/batch
Epoch 12/20  Iteration 1965/3560 Training loss: 1.3791 6.0688 sec/batch
Epoch 12/20  Iteration 1966/3560 Training loss: 1.3759 5.9917 sec/batch
Epoch 12/20  Iteration 1967/3560 Training loss: 1.3757 6.0267 sec/batch
Epoch 12/20  Iteration 1968/3560 Training loss: 1.3740 6.0247 sec/batch
Epoch 12/20  Iteration 1969/3560 Training loss: 1.3707 5.9883 sec/batch
Epoch 12/20  Iteration 1970/3560 Training loss: 1.3688 6.0133 sec/batch
Epoch 12/20  Iteration 1971/3560 Training loss: 1.3684 6.0241 sec/batch
Epoch 12/20  Iteration 1972/3560 Training loss: 1.3689 5.9357 sec/batch
Epoch 12/20  Iteration 1973/3560 Training loss: 1.3670 5.9837 sec/batch
Epoch 12/20  Iteration 1974/3560 Training loss: 1.3652 6.0015 sec/batch
Epoch 12/20  Iteration 1975/3560 Training loss: 1.3651 6.0439 sec/batch
Epoch 12/20  Iteration 1976/3560 Training loss: 1.3658 6.3837 sec/batch
Epoch 12/20  Iteration 1977/3560 Training loss: 1.3649 6.0465 sec/batch
Epoch 12/20  Iteration 1978/3560 Training loss: 1.3661 5.9631 sec/batch
Epoch 12/20  Iteration 1979/3560 Training loss: 1.3649 5.9902 sec/batch
Epoch 12/20  Iteration 1980/3560 Training loss: 1.3652 6.0303 sec/batch
Epoch 12/20  Iteration 1981/3560 Training loss: 1.3638 5.9979 sec/batch
Epoch 12/20  Iteration 1982/3560 Training loss: 1.3631 6.0267 sec/batch
Epoch 12/20  Iteration 1983/3560 Training loss: 1.3625 5.9676 sec/batch
Epoch 12/20  Iteration 1984/3560 Training loss: 1.3604 6.0488 sec/batch
Epoch 12/20  Iteration 1985/3560 Training loss: 1.3594 6.0292 sec/batch
Epoch 12/20  Iteration 1986/3560 Training loss: 1.3598 5.9666 sec/batch
Epoch 12/20  Iteration 1987/3560 Training loss: 1.3597 6.0370 sec/batch
Epoch 12/20  Iteration 1988/3560 Training loss: 1.3598 5.9994 sec/batch
Epoch 12/20  Iteration 1989/3560 Training loss: 1.3593 5.9963 sec/batch
Epoch 12/20  Iteration 1990/3560 Training loss: 1.3582 5.9991 sec/batch
Epoch 12/20  Iteration 1991/3560 Training loss: 1.3585 6.0636 sec/batch
Epoch 12/20  Iteration 1992/3560 Training loss: 1.3585 5.9889 sec/batch
Epoch 12/20  Iteration 1993/3560 Training loss: 1.3581 5.9900 sec/batch
Epoch 12/20  Iteration 1994/3560 Training loss: 1.3580 5.9937 sec/batch
Epoch 12/20  Iteration 1995/3560 Training loss: 1.3571 6.1005 sec/batch
Epoch 12/20  Iteration 1996/3560 Training loss: 1.3559 6.2586 sec/batch
Epoch 12/20  Iteration 1997/3560 Training loss: 1.3545 5.9481 sec/batch
Epoch 12/20  Iteration 1998/3560 Training loss: 1.3539 5.9878 sec/batch
Epoch 12/20  Iteration 1999/3560 Training loss: 1.3533 5.9630 sec/batch
Epoch 12/20  Iteration 2000/3560 Training loss: 1.3539 6.0035 sec/batch
Epoch 12/20  Iteration 2001/3560 Training loss: 1.3534 5.9670 sec/batch
Epoch 12/20  Iteration 2002/3560 Training loss: 1.3526 5.9950 sec/batch
Epoch 12/20  Iteration 2003/3560 Training loss: 1.3529 5.9676 sec/batch
Epoch 12/20  Iteration 2004/3560 Training loss: 1.3522 5.9475 sec/batch
Epoch 12/20  Iteration 2005/3560 Training loss: 1.3519 5.9566 sec/batch
Epoch 12/20  Iteration 2006/3560 Training loss: 1.3516 5.9940 sec/batch
Epoch 12/20  Iteration 2007/3560 Training loss: 1.3514 6.0082 sec/batch
Epoch 12/20  Iteration 2008/3560 Training loss: 1.3515 6.0111 sec/batch
Epoch 12/20  Iteration 2009/3560 Training loss: 1.3512 6.2584 sec/batch
Epoch 12/20  Iteration 2010/3560 Training loss: 1.3519 6.0416 sec/batch
Epoch 12/20  Iteration 2011/3560 Training loss: 1.3517 6.1096 sec/batch
Epoch 12/20  Iteration 2012/3560 Training loss: 1.3519 6.6944 sec/batch
Epoch 12/20  Iteration 2013/3560 Training loss: 1.3517 7.2036 sec/batch
Epoch 12/20  Iteration 2014/3560 Training loss: 1.3518 6.9038 sec/batch
Epoch 12/20  Iteration 2015/3560 Training loss: 1.3520 9.0788 sec/batch
Epoch 12/20  Iteration 2016/3560 Training loss: 1.3515 10.9115 sec/batch
Epoch 12/20  Iteration 2017/3560 Training loss: 1.3509 9.1497 sec/batch
Epoch 12/20  Iteration 2018/3560 Training loss: 1.3514 7.6199 sec/batch
Epoch 12/20  Iteration 2019/3560 Training loss: 1.3513 6.2472 sec/batch
Epoch 12/20  Iteration 2020/3560 Training loss: 1.3522 6.2567 sec/batch
Epoch 12/20  Iteration 2021/3560 Training loss: 1.3524 6.1641 sec/batch
Epoch 12/20  Iteration 2022/3560 Training loss: 1.3523 6.0988 sec/batch
Epoch 12/20  Iteration 2023/3560 Training loss: 1.3523 6.1262 sec/batch
Epoch 12/20  Iteration 2024/3560 Training loss: 1.3522 6.2226 sec/batch
Epoch 12/20  Iteration 2025/3560 Training loss: 1.3525 6.1552 sec/batch
Epoch 12/20  Iteration 2026/3560 Training loss: 1.3520 6.1558 sec/batch
Epoch 12/20  Iteration 2027/3560 Training loss: 1.3520 6.1132 sec/batch
Epoch 12/20  Iteration 2028/3560 Training loss: 1.3520 6.1003 sec/batch
Epoch 12/20  Iteration 2029/3560 Training loss: 1.3525 6.1032 sec/batch
Epoch 12/20  Iteration 2030/3560 Training loss: 1.3526 6.1454 sec/batch
Epoch 12/20  Iteration 2031/3560 Training loss: 1.3531 6.1107 sec/batch
Epoch 12/20  Iteration 2032/3560 Training loss: 1.3527 6.3268 sec/batch
Epoch 12/20  Iteration 2033/3560 Training loss: 1.3527 6.5310 sec/batch
Epoch 12/20  Iteration 2034/3560 Training loss: 1.3527 6.1724 sec/batch
Epoch 12/20  Iteration 2035/3560 Training loss: 1.3525 6.2203 sec/batch
Epoch 12/20  Iteration 2036/3560 Training loss: 1.3524 6.1645 sec/batch
Epoch 12/20  Iteration 2037/3560 Training loss: 1.3517 6.1344 sec/batch
Epoch 12/20  Iteration 2038/3560 Training loss: 1.3517 6.1216 sec/batch
Epoch 12/20  Iteration 2039/3560 Training loss: 1.3513 6.1430 sec/batch
Epoch 12/20  Iteration 2040/3560 Training loss: 1.3512 6.1561 sec/batch
Epoch 12/20  Iteration 2041/3560 Training loss: 1.3507 6.1158 sec/batch
Epoch 12/20  Iteration 2042/3560 Training loss: 1.3506 6.1303 sec/batch
Epoch 12/20  Iteration 2043/3560 Training loss: 1.3503 6.1065 sec/batch
Epoch 12/20  Iteration 2044/3560 Training loss: 1.3501 6.1082 sec/batch
Epoch 12/20  Iteration 2045/3560 Training loss: 1.3499 6.1047 sec/batch
Epoch 12/20  Iteration 2046/3560 Training loss: 1.3495 6.1617 sec/batch
Epoch 12/20  Iteration 2047/3560 Training loss: 1.3491 6.1446 sec/batch
Epoch 12/20  Iteration 2048/3560 Training loss: 1.3491 6.1354 sec/batch
Epoch 12/20  Iteration 2049/3560 Training loss: 1.3488 6.2084 sec/batch
Epoch 12/20  Iteration 2050/3560 Training loss: 1.3486 6.2666 sec/batch
Epoch 12/20  Iteration 2051/3560 Training loss: 1.3481 6.1409 sec/batch
Epoch 12/20  Iteration 2052/3560 Training loss: 1.3478 6.4319 sec/batch
Epoch 12/20  Iteration 2053/3560 Training loss: 1.3475 6.1220 sec/batch
Epoch 12/20  Iteration 2054/3560 Training loss: 1.3476 6.1125 sec/batch
Epoch 12/20  Iteration 2055/3560 Training loss: 1.3475 6.6014 sec/batch
Epoch 12/20  Iteration 2056/3560 Training loss: 1.3471 6.1632 sec/batch
Epoch 12/20  Iteration 2057/3560 Training loss: 1.3467 6.1743 sec/batch
Epoch 12/20  Iteration 2058/3560 Training loss: 1.3463 6.1892 sec/batch
Epoch 12/20  Iteration 2059/3560 Training loss: 1.3463 6.1185 sec/batch
Epoch 12/20  Iteration 2060/3560 Training loss: 1.3461 6.1112 sec/batch
Epoch 12/20  Iteration 2061/3560 Training loss: 1.3459 6.4684 sec/batch
Epoch 12/20  Iteration 2062/3560 Training loss: 1.3458 6.2285 sec/batch
Epoch 12/20  Iteration 2063/3560 Training loss: 1.3456 6.2531 sec/batch
Epoch 12/20  Iteration 2064/3560 Training loss: 1.3455 6.3705 sec/batch
Epoch 12/20  Iteration 2065/3560 Training loss: 1.3454 6.3500 sec/batch
Epoch 12/20  Iteration 2066/3560 Training loss: 1.3454 6.3647 sec/batch
Epoch 12/20  Iteration 2067/3560 Training loss: 1.3452 6.3678 sec/batch
Epoch 12/20  Iteration 2068/3560 Training loss: 1.3452 6.2329 sec/batch
Epoch 12/20  Iteration 2069/3560 Training loss: 1.3450 6.1835 sec/batch
Epoch 12/20  Iteration 2070/3560 Training loss: 1.3449 6.4104 sec/batch
Epoch 12/20  Iteration 2071/3560 Training loss: 1.3448 6.6534 sec/batch
Epoch 12/20  Iteration 2072/3560 Training loss: 1.3446 6.2907 sec/batch
Epoch 12/20  Iteration 2073/3560 Training loss: 1.3443 6.1680 sec/batch
Epoch 12/20  Iteration 2074/3560 Training loss: 1.3439 6.3293 sec/batch
Epoch 12/20  Iteration 2075/3560 Training loss: 1.3439 7.5862 sec/batch
Epoch 12/20  Iteration 2076/3560 Training loss: 1.3438 6.5159 sec/batch
Epoch 12/20  Iteration 2077/3560 Training loss: 1.3436 6.2625 sec/batch
Epoch 12/20  Iteration 2078/3560 Training loss: 1.3435 6.3066 sec/batch
Epoch 12/20  Iteration 2079/3560 Training loss: 1.3433 6.4605 sec/batch
Epoch 12/20  Iteration 2080/3560 Training loss: 1.3429 7.2015 sec/batch
Epoch 12/20  Iteration 2081/3560 Training loss: 1.3424 6.7500 sec/batch
Epoch 12/20  Iteration 2082/3560 Training loss: 1.3423 7.3521 sec/batch
Epoch 12/20  Iteration 2083/3560 Training loss: 1.3422 6.7850 sec/batch
Epoch 12/20  Iteration 2084/3560 Training loss: 1.3417 7.2450 sec/batch
Epoch 12/20  Iteration 2085/3560 Training loss: 1.3417 6.3766 sec/batch
Epoch 12/20  Iteration 2086/3560 Training loss: 1.3418 7.0898 sec/batch
Epoch 12/20  Iteration 2087/3560 Training loss: 1.3416 8.2429 sec/batch
Epoch 12/20  Iteration 2088/3560 Training loss: 1.3413 9.1139 sec/batch
Epoch 12/20  Iteration 2089/3560 Training loss: 1.3409 7.7923 sec/batch
Epoch 12/20  Iteration 2090/3560 Training loss: 1.3407 7.8196 sec/batch
Epoch 12/20  Iteration 2091/3560 Training loss: 1.3407 7.1846 sec/batch
Epoch 12/20  Iteration 2092/3560 Training loss: 1.3406 7.4277 sec/batch
Epoch 12/20  Iteration 2093/3560 Training loss: 1.3406 7.4899 sec/batch
Epoch 12/20  Iteration 2094/3560 Training loss: 1.3406 8.8186 sec/batch
Epoch 12/20  Iteration 2095/3560 Training loss: 1.3407 6.2391 sec/batch
Epoch 12/20  Iteration 2096/3560 Training loss: 1.3408 6.2687 sec/batch
Epoch 12/20  Iteration 2097/3560 Training loss: 1.3408 6.2233 sec/batch
Epoch 12/20  Iteration 2098/3560 Training loss: 1.3408 6.1375 sec/batch
Epoch 12/20  Iteration 2099/3560 Training loss: 1.3411 6.2264 sec/batch
Epoch 12/20  Iteration 2100/3560 Training loss: 1.3411 6.6946 sec/batch
Epoch 12/20  Iteration 2101/3560 Training loss: 1.3409 6.3116 sec/batch
Epoch 12/20  Iteration 2102/3560 Training loss: 1.3411 6.4319 sec/batch
Epoch 12/20  Iteration 2103/3560 Training loss: 1.3409 6.2208 sec/batch
Epoch 12/20  Iteration 2104/3560 Training loss: 1.3411 6.2915 sec/batch
Epoch 12/20  Iteration 2105/3560 Training loss: 1.3411 6.2159 sec/batch
Epoch 12/20  Iteration 2106/3560 Training loss: 1.3413 6.5865 sec/batch
Epoch 12/20  Iteration 2107/3560 Training loss: 1.3414 6.2446 sec/batch
Epoch 12/20  Iteration 2108/3560 Training loss: 1.3412 6.2229 sec/batch
Epoch 12/20  Iteration 2109/3560 Training loss: 1.3409 6.2841 sec/batch
Epoch 12/20  Iteration 2110/3560 Training loss: 1.3407 6.1684 sec/batch
Epoch 12/20  Iteration 2111/3560 Training loss: 1.3407 6.1464 sec/batch
Epoch 12/20  Iteration 2112/3560 Training loss: 1.3406 6.3055 sec/batch
Epoch 12/20  Iteration 2113/3560 Training loss: 1.3406 6.2921 sec/batch
Epoch 12/20  Iteration 2114/3560 Training loss: 1.3405 6.2403 sec/batch
Epoch 12/20  Iteration 2115/3560 Training loss: 1.3404 6.1922 sec/batch
Epoch 12/20  Iteration 2116/3560 Training loss: 1.3403 6.1630 sec/batch
Epoch 12/20  Iteration 2117/3560 Training loss: 1.3401 6.2541 sec/batch
Epoch 12/20  Iteration 2118/3560 Training loss: 1.3401 6.2274 sec/batch
Epoch 12/20  Iteration 2119/3560 Training loss: 1.3403 6.1910 sec/batch
Epoch 12/20  Iteration 2120/3560 Training loss: 1.3402 6.7317 sec/batch
Epoch 12/20  Iteration 2121/3560 Training loss: 1.3401 7.2489 sec/batch
Epoch 12/20  Iteration 2122/3560 Training loss: 1.3402 6.6128 sec/batch
Epoch 12/20  Iteration 2123/3560 Training loss: 1.3402 6.5745 sec/batch
Epoch 12/20  Iteration 2124/3560 Training loss: 1.3401 6.5711 sec/batch
Epoch 12/20  Iteration 2125/3560 Training loss: 1.3403 7.0117 sec/batch
Epoch 12/20  Iteration 2126/3560 Training loss: 1.3406 6.5355 sec/batch
Epoch 12/20  Iteration 2127/3560 Training loss: 1.3406 6.5952 sec/batch
Epoch 12/20  Iteration 2128/3560 Training loss: 1.3405 6.5732 sec/batch
Epoch 12/20  Iteration 2129/3560 Training loss: 1.3404 6.5656 sec/batch
Epoch 12/20  Iteration 2130/3560 Training loss: 1.3402 6.5653 sec/batch
Epoch 12/20  Iteration 2131/3560 Training loss: 1.3404 6.6371 sec/batch
Epoch 12/20  Iteration 2132/3560 Training loss: 1.3403 6.5610 sec/batch
Epoch 12/20  Iteration 2133/3560 Training loss: 1.3403 6.5316 sec/batch
Epoch 12/20  Iteration 2134/3560 Training loss: 1.3401 6.5813 sec/batch
Epoch 12/20  Iteration 2135/3560 Training loss: 1.3400 6.5999 sec/batch
Epoch 12/20  Iteration 2136/3560 Training loss: 1.3401 6.5137 sec/batch
Epoch 13/20  Iteration 2137/3560 Training loss: 1.4548 6.6064 sec/batch
Epoch 13/20  Iteration 2138/3560 Training loss: 1.4026 6.5322 sec/batch
Epoch 13/20  Iteration 2139/3560 Training loss: 1.3827 6.5437 sec/batch
Epoch 13/20  Iteration 2140/3560 Training loss: 1.3767 6.5545 sec/batch
Epoch 13/20  Iteration 2141/3560 Training loss: 1.3666 6.5265 sec/batch
Epoch 13/20  Iteration 2142/3560 Training loss: 1.3555 6.6499 sec/batch
Epoch 13/20  Iteration 2143/3560 Training loss: 1.3529 7.0630 sec/batch
Epoch 13/20  Iteration 2144/3560 Training loss: 1.3503 6.5799 sec/batch
Epoch 13/20  Iteration 2145/3560 Training loss: 1.3493 6.5874 sec/batch
Epoch 13/20  Iteration 2146/3560 Training loss: 1.3477 7.0145 sec/batch
Epoch 13/20  Iteration 2147/3560 Training loss: 1.3436 6.5784 sec/batch
Epoch 13/20  Iteration 2148/3560 Training loss: 1.3417 6.5290 sec/batch
Epoch 13/20  Iteration 2149/3560 Training loss: 1.3407 6.5420 sec/batch
Epoch 13/20  Iteration 2150/3560 Training loss: 1.3413 6.5269 sec/batch
Epoch 13/20  Iteration 2151/3560 Training loss: 1.3395 6.5499 sec/batch
Epoch 13/20  Iteration 2152/3560 Training loss: 1.3377 6.5796 sec/batch
Epoch 13/20  Iteration 2153/3560 Training loss: 1.3379 6.5297 sec/batch
Epoch 13/20  Iteration 2154/3560 Training loss: 1.3391 6.5330 sec/batch
Epoch 13/20  Iteration 2155/3560 Training loss: 1.3382 6.5395 sec/batch
Epoch 13/20  Iteration 2156/3560 Training loss: 1.3390 6.8097 sec/batch
Epoch 13/20  Iteration 2157/3560 Training loss: 1.3383 7.7435 sec/batch
Epoch 13/20  Iteration 2158/3560 Training loss: 1.3385 7.1207 sec/batch
Epoch 13/20  Iteration 2159/3560 Training loss: 1.3373 7.2296 sec/batch
Epoch 13/20  Iteration 2160/3560 Training loss: 1.3370 11.2066 sec/batch
Epoch 13/20  Iteration 2161/3560 Training loss: 1.3368 8.5371 sec/batch
Epoch 13/20  Iteration 2162/3560 Training loss: 1.3348 8.8048 sec/batch
Epoch 13/20  Iteration 2163/3560 Training loss: 1.3335 6.8987 sec/batch
Epoch 13/20  Iteration 2164/3560 Training loss: 1.3339 6.5744 sec/batch
Epoch 13/20  Iteration 2165/3560 Training loss: 1.3337 6.4372 sec/batch
Epoch 13/20  Iteration 2166/3560 Training loss: 1.3337 6.3290 sec/batch
Epoch 13/20  Iteration 2167/3560 Training loss: 1.3331 6.3408 sec/batch
Epoch 13/20  Iteration 2168/3560 Training loss: 1.3318 6.3275 sec/batch
Epoch 13/20  Iteration 2169/3560 Training loss: 1.3320 6.2664 sec/batch
Epoch 13/20  Iteration 2170/3560 Training loss: 1.3320 6.3632 sec/batch
Epoch 13/20  Iteration 2171/3560 Training loss: 1.3316 6.3578 sec/batch
Epoch 13/20  Iteration 2172/3560 Training loss: 1.3315 6.2608 sec/batch
Epoch 13/20  Iteration 2173/3560 Training loss: 1.3307 6.2600 sec/batch
Epoch 13/20  Iteration 2174/3560 Training loss: 1.3294 6.3394 sec/batch
Epoch 13/20  Iteration 2175/3560 Training loss: 1.3281 6.3490 sec/batch
Epoch 13/20  Iteration 2176/3560 Training loss: 1.3275 6.4361 sec/batch
Epoch 13/20  Iteration 2177/3560 Training loss: 1.3269 6.3691 sec/batch
Epoch 13/20  Iteration 2178/3560 Training loss: 1.3275 6.7877 sec/batch
Epoch 13/20  Iteration 2179/3560 Training loss: 1.3271 6.3865 sec/batch
Epoch 13/20  Iteration 2180/3560 Training loss: 1.3264 6.3633 sec/batch
Epoch 13/20  Iteration 2181/3560 Training loss: 1.3265 6.3397 sec/batch
Epoch 13/20  Iteration 2182/3560 Training loss: 1.3257 6.2902 sec/batch
Epoch 13/20  Iteration 2183/3560 Training loss: 1.3251 6.2976 sec/batch
Epoch 13/20  Iteration 2184/3560 Training loss: 1.3248 6.3510 sec/batch
Epoch 13/20  Iteration 2185/3560 Training loss: 1.3245 6.3063 sec/batch
Epoch 13/20  Iteration 2186/3560 Training loss: 1.3247 6.4397 sec/batch
Epoch 13/20  Iteration 2187/3560 Training loss: 1.3242 6.3211 sec/batch
Epoch 13/20  Iteration 2188/3560 Training loss: 1.3247 6.3120 sec/batch
Epoch 13/20  Iteration 2189/3560 Training loss: 1.3247 6.2507 sec/batch
Epoch 13/20  Iteration 2190/3560 Training loss: 1.3248 6.3542 sec/batch
Epoch 13/20  Iteration 2191/3560 Training loss: 1.3246 6.7919 sec/batch
Epoch 13/20  Iteration 2192/3560 Training loss: 1.3245 6.3068 sec/batch
Epoch 13/20  Iteration 2193/3560 Training loss: 1.3249 6.4545 sec/batch
Epoch 13/20  Iteration 2194/3560 Training loss: 1.3245 6.3089 sec/batch
Epoch 13/20  Iteration 2195/3560 Training loss: 1.3238 6.2711 sec/batch
Epoch 13/20  Iteration 2196/3560 Training loss: 1.3244 6.3749 sec/batch
Epoch 13/20  Iteration 2197/3560 Training loss: 1.3242 6.6158 sec/batch
Epoch 13/20  Iteration 2198/3560 Training loss: 1.3251 6.2823 sec/batch
Epoch 13/20  Iteration 2199/3560 Training loss: 1.3254 6.3311 sec/batch
Epoch 13/20  Iteration 2200/3560 Training loss: 1.3254 6.5899 sec/batch
Epoch 13/20  Iteration 2201/3560 Training loss: 1.3252 6.7997 sec/batch
Epoch 13/20  Iteration 2202/3560 Training loss: 1.3254 6.4383 sec/batch
Epoch 13/20  Iteration 2203/3560 Training loss: 1.3255 6.3552 sec/batch
...
Epoch 13/20  Iteration 2314/3560 Training loss: 1.3140 6.2809 sec/batch
Epoch 14/20  Iteration 2315/3560 Training loss: 1.4173 6.2831 sec/batch
...
Epoch 14/20  Iteration 2492/3560 Training loss: 1.2910 6.2719 sec/batch
Epoch 15/20  Iteration 2493/3560 Training loss: 1.4033 6.2557 sec/batch
...
Epoch 15/20  Iteration 2670/3560 Training loss: 1.2708 6.3799 sec/batch
Epoch 16/20  Iteration 2671/3560 Training loss: 1.3823 6.2887 sec/batch
...
Epoch 16/20  Iteration 2848/3560 Training loss: 1.2533 6.7079 sec/batch
Epoch 17/20  Iteration 2849/3560 Training loss: 1.3556 8.1580 sec/batch
...
Epoch 17/20  Iteration 3026/3560 Training loss: 1.2369 6.2593 sec/batch
Epoch 18/20  Iteration 3027/3560 Training loss: 1.3350 6.2684 sec/batch
...
Epoch 18/20  Iteration 3093/3560 Training loss: 1.2301 6.2817 sec/batch
Epoch 18/20  Iteration 3094/3560 Training loss: 1.2299 6.2909 sec/batch
Epoch 18/20  Iteration 3095/3560 Training loss: 1.2301 6.2314 sec/batch
Epoch 18/20  Iteration 3096/3560 Training loss: 1.2300 6.2565 sec/batch
Epoch 18/20  Iteration 3097/3560 Training loss: 1.2305 6.6651 sec/batch
Epoch 18/20  Iteration 3098/3560 Training loss: 1.2308 6.3849 sec/batch
Epoch 18/20  Iteration 3099/3560 Training loss: 1.2313 6.3834 sec/batch
Epoch 18/20  Iteration 3100/3560 Training loss: 1.2310 6.3243 sec/batch
Epoch 18/20  Iteration 3101/3560 Training loss: 1.2310 6.2605 sec/batch
Epoch 18/20  Iteration 3102/3560 Training loss: 1.2311 6.2234 sec/batch
Epoch 18/20  Iteration 3103/3560 Training loss: 1.2311 6.3168 sec/batch
Epoch 18/20  Iteration 3104/3560 Training loss: 1.2309 6.4779 sec/batch
Epoch 18/20  Iteration 3105/3560 Training loss: 1.2304 6.3442 sec/batch
Epoch 18/20  Iteration 3106/3560 Training loss: 1.2303 6.3638 sec/batch
Epoch 18/20  Iteration 3107/3560 Training loss: 1.2300 6.2679 sec/batch
Epoch 18/20  Iteration 3108/3560 Training loss: 1.2299 6.2825 sec/batch
Epoch 18/20  Iteration 3109/3560 Training loss: 1.2295 6.3663 sec/batch
Epoch 18/20  Iteration 3110/3560 Training loss: 1.2295 6.7331 sec/batch
Epoch 18/20  Iteration 3111/3560 Training loss: 1.2292 6.4374 sec/batch
Epoch 18/20  Iteration 3112/3560 Training loss: 1.2291 6.5137 sec/batch
Epoch 18/20  Iteration 3113/3560 Training loss: 1.2289 6.3499 sec/batch
Epoch 18/20  Iteration 3114/3560 Training loss: 1.2287 6.3253 sec/batch
Epoch 18/20  Iteration 3115/3560 Training loss: 1.2284 6.3184 sec/batch
Epoch 18/20  Iteration 3116/3560 Training loss: 1.2285 6.6822 sec/batch
Epoch 18/20  Iteration 3117/3560 Training loss: 1.2283 6.2967 sec/batch
Epoch 18/20  Iteration 3118/3560 Training loss: 1.2282 6.3131 sec/batch
Epoch 18/20  Iteration 3119/3560 Training loss: 1.2278 6.3169 sec/batch
Epoch 18/20  Iteration 3120/3560 Training loss: 1.2275 6.2667 sec/batch
Epoch 18/20  Iteration 3121/3560 Training loss: 1.2272 6.3100 sec/batch
Epoch 18/20  Iteration 3122/3560 Training loss: 1.2273 6.2611 sec/batch
Epoch 18/20  Iteration 3123/3560 Training loss: 1.2273 6.3620 sec/batch
Epoch 18/20  Iteration 3124/3560 Training loss: 1.2269 6.3761 sec/batch
Epoch 18/20  Iteration 3125/3560 Training loss: 1.2265 6.3036 sec/batch
Epoch 18/20  Iteration 3126/3560 Training loss: 1.2263 6.3237 sec/batch
Epoch 18/20  Iteration 3127/3560 Training loss: 1.2263 6.3161 sec/batch
Epoch 18/20  Iteration 3128/3560 Training loss: 1.2262 6.3303 sec/batch
Epoch 18/20  Iteration 3129/3560 Training loss: 1.2260 6.3203 sec/batch
Epoch 18/20  Iteration 3130/3560 Training loss: 1.2258 6.3655 sec/batch
Epoch 18/20  Iteration 3131/3560 Training loss: 1.2255 6.3058 sec/batch
Epoch 18/20  Iteration 3132/3560 Training loss: 1.2255 6.3054 sec/batch
Epoch 18/20  Iteration 3133/3560 Training loss: 1.2253 6.3242 sec/batch
Epoch 18/20  Iteration 3134/3560 Training loss: 1.2253 6.3304 sec/batch
Epoch 18/20  Iteration 3135/3560 Training loss: 1.2251 6.8202 sec/batch
Epoch 18/20  Iteration 3136/3560 Training loss: 1.2252 6.3188 sec/batch
Epoch 18/20  Iteration 3137/3560 Training loss: 1.2250 6.3140 sec/batch
Epoch 18/20  Iteration 3138/3560 Training loss: 1.2249 6.3354 sec/batch
Epoch 18/20  Iteration 3139/3560 Training loss: 1.2247 6.2776 sec/batch
Epoch 18/20  Iteration 3140/3560 Training loss: 1.2245 6.2845 sec/batch
Epoch 18/20  Iteration 3141/3560 Training loss: 1.2243 6.3233 sec/batch
Epoch 18/20  Iteration 3142/3560 Training loss: 1.2240 6.3186 sec/batch
Epoch 18/20  Iteration 3143/3560 Training loss: 1.2240 6.3131 sec/batch
Epoch 18/20  Iteration 3144/3560 Training loss: 1.2240 6.3731 sec/batch
Epoch 18/20  Iteration 3145/3560 Training loss: 1.2240 6.3110 sec/batch
Epoch 18/20  Iteration 3146/3560 Training loss: 1.2239 6.2819 sec/batch
Epoch 18/20  Iteration 3147/3560 Training loss: 1.2238 6.3372 sec/batch
Epoch 18/20  Iteration 3148/3560 Training loss: 1.2235 6.3100 sec/batch
Epoch 18/20  Iteration 3149/3560 Training loss: 1.2232 6.2857 sec/batch
Epoch 18/20  Iteration 3150/3560 Training loss: 1.2232 6.2721 sec/batch
Epoch 18/20  Iteration 3151/3560 Training loss: 1.2231 6.2841 sec/batch
Epoch 18/20  Iteration 3152/3560 Training loss: 1.2227 6.3860 sec/batch
Epoch 18/20  Iteration 3153/3560 Training loss: 1.2228 6.3370 sec/batch
Epoch 18/20  Iteration 3154/3560 Training loss: 1.2227 6.6639 sec/batch
Epoch 18/20  Iteration 3155/3560 Training loss: 1.2225 6.2897 sec/batch
Epoch 18/20  Iteration 3156/3560 Training loss: 1.2222 6.3105 sec/batch
Epoch 18/20  Iteration 3157/3560 Training loss: 1.2217 6.6290 sec/batch
Epoch 18/20  Iteration 3158/3560 Training loss: 1.2216 6.4077 sec/batch
Epoch 18/20  Iteration 3159/3560 Training loss: 1.2218 6.2622 sec/batch
Epoch 18/20  Iteration 3160/3560 Training loss: 1.2218 6.2951 sec/batch
Epoch 18/20  Iteration 3161/3560 Training loss: 1.2218 6.2985 sec/batch
Epoch 18/20  Iteration 3162/3560 Training loss: 1.2218 6.2770 sec/batch
Epoch 18/20  Iteration 3163/3560 Training loss: 1.2220 6.2518 sec/batch
Epoch 18/20  Iteration 3164/3560 Training loss: 1.2221 6.3131 sec/batch
Epoch 18/20  Iteration 3165/3560 Training loss: 1.2221 6.2573 sec/batch
Epoch 18/20  Iteration 3166/3560 Training loss: 1.2221 6.2762 sec/batch
Epoch 18/20  Iteration 3167/3560 Training loss: 1.2224 6.3328 sec/batch
Epoch 18/20  Iteration 3168/3560 Training loss: 1.2226 6.3732 sec/batch
Epoch 18/20  Iteration 3169/3560 Training loss: 1.2225 6.2769 sec/batch
Epoch 18/20  Iteration 3170/3560 Training loss: 1.2227 6.3434 sec/batch
Epoch 18/20  Iteration 3171/3560 Training loss: 1.2226 6.2928 sec/batch
Epoch 18/20  Iteration 3172/3560 Training loss: 1.2228 6.3202 sec/batch
Epoch 18/20  Iteration 3173/3560 Training loss: 1.2229 6.7665 sec/batch
Epoch 18/20  Iteration 3174/3560 Training loss: 1.2231 6.2513 sec/batch
Epoch 18/20  Iteration 3175/3560 Training loss: 1.2232 6.2288 sec/batch
Epoch 18/20  Iteration 3176/3560 Training loss: 1.2231 6.2691 sec/batch
Epoch 18/20  Iteration 3177/3560 Training loss: 1.2228 6.2188 sec/batch
Epoch 18/20  Iteration 3178/3560 Training loss: 1.2227 7.0067 sec/batch
Epoch 18/20  Iteration 3179/3560 Training loss: 1.2228 7.0429 sec/batch
Epoch 18/20  Iteration 3180/3560 Training loss: 1.2228 6.5104 sec/batch
Epoch 18/20  Iteration 3181/3560 Training loss: 1.2227 7.0041 sec/batch
Epoch 18/20  Iteration 3182/3560 Training loss: 1.2227 7.1676 sec/batch
Epoch 18/20  Iteration 3183/3560 Training loss: 1.2226 6.8714 sec/batch
Epoch 18/20  Iteration 3184/3560 Training loss: 1.2225 6.3286 sec/batch
Epoch 18/20  Iteration 3185/3560 Training loss: 1.2223 6.3350 sec/batch
Epoch 18/20  Iteration 3186/3560 Training loss: 1.2223 6.2898 sec/batch
Epoch 18/20  Iteration 3187/3560 Training loss: 1.2225 6.3724 sec/batch
Epoch 18/20  Iteration 3188/3560 Training loss: 1.2224 6.2580 sec/batch
Epoch 18/20  Iteration 3189/3560 Training loss: 1.2224 6.4658 sec/batch
Epoch 18/20  Iteration 3190/3560 Training loss: 1.2224 6.3028 sec/batch
Epoch 18/20  Iteration 3191/3560 Training loss: 1.2223 6.6665 sec/batch
Epoch 18/20  Iteration 3192/3560 Training loss: 1.2223 6.3230 sec/batch
Epoch 18/20  Iteration 3193/3560 Training loss: 1.2225 6.3675 sec/batch
Epoch 18/20  Iteration 3194/3560 Training loss: 1.2229 6.3376 sec/batch
Epoch 18/20  Iteration 3195/3560 Training loss: 1.2229 6.3286 sec/batch
Epoch 18/20  Iteration 3196/3560 Training loss: 1.2229 6.2799 sec/batch
Epoch 18/20  Iteration 3197/3560 Training loss: 1.2228 6.3450 sec/batch
Epoch 18/20  Iteration 3198/3560 Training loss: 1.2228 6.3324 sec/batch
Epoch 18/20  Iteration 3199/3560 Training loss: 1.2229 6.2973 sec/batch
Epoch 18/20  Iteration 3200/3560 Training loss: 1.2229 6.3419 sec/batch
Epoch 18/20  Iteration 3201/3560 Training loss: 1.2230 6.2491 sec/batch
Epoch 18/20  Iteration 3202/3560 Training loss: 1.2228 6.3193 sec/batch
Epoch 18/20  Iteration 3203/3560 Training loss: 1.2227 6.3010 sec/batch
Epoch 18/20  Iteration 3204/3560 Training loss: 1.2229 6.7850 sec/batch
Epoch 19/20  Iteration 3205/3560 Training loss: 1.3327 6.3344 sec/batch
Epoch 19/20  Iteration 3206/3560 Training loss: 1.2915 6.3340 sec/batch
Epoch 19/20  Iteration 3207/3560 Training loss: 1.2756 6.2581 sec/batch
Epoch 19/20  Iteration 3208/3560 Training loss: 1.2712 6.2772 sec/batch
Epoch 19/20  Iteration 3209/3560 Training loss: 1.2583 6.4317 sec/batch
Epoch 19/20  Iteration 3210/3560 Training loss: 1.2463 6.6436 sec/batch
Epoch 19/20  Iteration 3211/3560 Training loss: 1.2431 6.3106 sec/batch
Epoch 19/20  Iteration 3212/3560 Training loss: 1.2394 6.2638 sec/batch
Epoch 19/20  Iteration 3213/3560 Training loss: 1.2376 6.2856 sec/batch
Epoch 19/20  Iteration 3214/3560 Training loss: 1.2364 6.2901 sec/batch
Epoch 19/20  Iteration 3215/3560 Training loss: 1.2335 6.2855 sec/batch
Epoch 19/20  Iteration 3216/3560 Training loss: 1.2335 6.3038 sec/batch
Epoch 19/20  Iteration 3217/3560 Training loss: 1.2324 6.7629 sec/batch
Epoch 19/20  Iteration 3218/3560 Training loss: 1.2323 6.3221 sec/batch
Epoch 19/20  Iteration 3219/3560 Training loss: 1.2302 6.2815 sec/batch
Epoch 19/20  Iteration 3220/3560 Training loss: 1.2284 6.3397 sec/batch
Epoch 19/20  Iteration 3221/3560 Training loss: 1.2288 6.2132 sec/batch
Epoch 19/20  Iteration 3222/3560 Training loss: 1.2293 6.3014 sec/batch
Epoch 19/20  Iteration 3223/3560 Training loss: 1.2290 6.2982 sec/batch
Epoch 19/20  Iteration 3224/3560 Training loss: 1.2295 6.2515 sec/batch
Epoch 19/20  Iteration 3225/3560 Training loss: 1.2287 6.3120 sec/batch
Epoch 19/20  Iteration 3226/3560 Training loss: 1.2288 6.2932 sec/batch
Epoch 19/20  Iteration 3227/3560 Training loss: 1.2280 6.3537 sec/batch
Epoch 19/20  Iteration 3228/3560 Training loss: 1.2279 6.3037 sec/batch
Epoch 19/20  Iteration 3229/3560 Training loss: 1.2273 6.6627 sec/batch
Epoch 19/20  Iteration 3230/3560 Training loss: 1.2258 6.2853 sec/batch
Epoch 19/20  Iteration 3231/3560 Training loss: 1.2245 6.2743 sec/batch
Epoch 19/20  Iteration 3232/3560 Training loss: 1.2244 6.3196 sec/batch
Epoch 19/20  Iteration 3233/3560 Training loss: 1.2246 6.2325 sec/batch
Epoch 19/20  Iteration 3234/3560 Training loss: 1.2248 6.2138 sec/batch
Epoch 19/20  Iteration 3235/3560 Training loss: 1.2241 6.3224 sec/batch
Epoch 19/20  Iteration 3236/3560 Training loss: 1.2231 6.6373 sec/batch
Epoch 19/20  Iteration 3237/3560 Training loss: 1.2233 6.4174 sec/batch
Epoch 19/20  Iteration 3238/3560 Training loss: 1.2232 6.2530 sec/batch
Epoch 19/20  Iteration 3239/3560 Training loss: 1.2225 6.3461 sec/batch
Epoch 19/20  Iteration 3240/3560 Training loss: 1.2223 6.2763 sec/batch
Epoch 19/20  Iteration 3241/3560 Training loss: 1.2217 6.3064 sec/batch
Epoch 19/20  Iteration 3242/3560 Training loss: 1.2205 6.5279 sec/batch
Epoch 19/20  Iteration 3243/3560 Training loss: 1.2192 7.0046 sec/batch
Epoch 19/20  Iteration 3244/3560 Training loss: 1.2191 7.8989 sec/batch
Epoch 19/20  Iteration 3245/3560 Training loss: 1.2183 8.7679 sec/batch
Epoch 19/20  Iteration 3246/3560 Training loss: 1.2192 7.2033 sec/batch
Epoch 19/20  Iteration 3247/3560 Training loss: 1.2188 7.1287 sec/batch
Epoch 19/20  Iteration 3248/3560 Training loss: 1.2181 6.3405 sec/batch
Epoch 19/20  Iteration 3249/3560 Training loss: 1.2179 6.3073 sec/batch
Epoch 19/20  Iteration 3250/3560 Training loss: 1.2173 6.6448 sec/batch
Epoch 19/20  Iteration 3251/3560 Training loss: 1.2170 6.4773 sec/batch
Epoch 19/20  Iteration 3252/3560 Training loss: 1.2163 6.3688 sec/batch
Epoch 19/20  Iteration 3253/3560 Training loss: 1.2163 6.3377 sec/batch
Epoch 19/20  Iteration 3254/3560 Training loss: 1.2162 6.3739 sec/batch
Epoch 19/20  Iteration 3255/3560 Training loss: 1.2156 7.1323 sec/batch
Epoch 19/20  Iteration 3256/3560 Training loss: 1.2162 8.8136 sec/batch
Epoch 19/20  Iteration 3257/3560 Training loss: 1.2161 8.0051 sec/batch
Epoch 19/20  Iteration 3258/3560 Training loss: 1.2163 6.3535 sec/batch
Epoch 19/20  Iteration 3259/3560 Training loss: 1.2161 6.4130 sec/batch
Epoch 19/20  Iteration 3260/3560 Training loss: 1.2161 6.3397 sec/batch
Epoch 19/20  Iteration 3261/3560 Training loss: 1.2163 6.2854 sec/batch
Epoch 19/20  Iteration 3262/3560 Training loss: 1.2161 6.2814 sec/batch
Epoch 19/20  Iteration 3263/3560 Training loss: 1.2154 6.3894 sec/batch
Epoch 19/20  Iteration 3264/3560 Training loss: 1.2160 6.3443 sec/batch
Epoch 19/20  Iteration 3265/3560 Training loss: 1.2160 6.7030 sec/batch
Epoch 19/20  Iteration 3266/3560 Training loss: 1.2168 6.3794 sec/batch
Epoch 19/20  Iteration 3267/3560 Training loss: 1.2169 6.3170 sec/batch
Epoch 19/20  Iteration 3268/3560 Training loss: 1.2170 6.3115 sec/batch
Epoch 19/20  Iteration 3269/3560 Training loss: 1.2170 6.3194 sec/batch
Epoch 19/20  Iteration 3270/3560 Training loss: 1.2172 6.2914 sec/batch
Epoch 19/20  Iteration 3271/3560 Training loss: 1.2174 6.3145 sec/batch
Epoch 19/20  Iteration 3272/3560 Training loss: 1.2172 6.3882 sec/batch
Epoch 19/20  Iteration 3273/3560 Training loss: 1.2173 6.3285 sec/batch
Epoch 19/20  Iteration 3274/3560 Training loss: 1.2172 6.3907 sec/batch
Epoch 19/20  Iteration 3275/3560 Training loss: 1.2177 6.3641 sec/batch
Epoch 19/20  Iteration 3276/3560 Training loss: 1.2180 6.3458 sec/batch
Epoch 19/20  Iteration 3277/3560 Training loss: 1.2185 6.3818 sec/batch
Epoch 19/20  Iteration 3278/3560 Training loss: 1.2181 6.3779 sec/batch
Epoch 19/20  Iteration 3279/3560 Training loss: 1.2180 6.3379 sec/batch
Epoch 19/20  Iteration 3280/3560 Training loss: 1.2181 6.3064 sec/batch
Epoch 19/20  Iteration 3281/3560 Training loss: 1.2180 6.2799 sec/batch
Epoch 19/20  Iteration 3282/3560 Training loss: 1.2179 6.3155 sec/batch
Epoch 19/20  Iteration 3283/3560 Training loss: 1.2173 6.4122 sec/batch
Epoch 19/20  Iteration 3284/3560 Training loss: 1.2173 6.6731 sec/batch
Epoch 19/20  Iteration 3285/3560 Training loss: 1.2169 6.3322 sec/batch
Epoch 19/20  Iteration 3286/3560 Training loss: 1.2168 6.3418 sec/batch
Epoch 19/20  Iteration 3287/3560 Training loss: 1.2164 6.3666 sec/batch
Epoch 19/20  Iteration 3288/3560 Training loss: 1.2164 6.2889 sec/batch
Epoch 19/20  Iteration 3289/3560 Training loss: 1.2161 6.3261 sec/batch
Epoch 19/20  Iteration 3290/3560 Training loss: 1.2160 6.3406 sec/batch
Epoch 19/20  Iteration 3291/3560 Training loss: 1.2158 6.3815 sec/batch
Epoch 19/20  Iteration 3292/3560 Training loss: 1.2156 6.3510 sec/batch
Epoch 19/20  Iteration 3293/3560 Training loss: 1.2152 6.2747 sec/batch
Epoch 19/20  Iteration 3294/3560 Training loss: 1.2153 6.9128 sec/batch
Epoch 19/20  Iteration 3295/3560 Training loss: 1.2150 6.3846 sec/batch
Epoch 19/20  Iteration 3296/3560 Training loss: 1.2150 6.8542 sec/batch
Epoch 19/20  Iteration 3297/3560 Training loss: 1.2146 6.3861 sec/batch
Epoch 19/20  Iteration 3298/3560 Training loss: 1.2142 6.3988 sec/batch
Epoch 19/20  Iteration 3299/3560 Training loss: 1.2139 6.4608 sec/batch
Epoch 19/20  Iteration 3300/3560 Training loss: 1.2139 6.3417 sec/batch
Epoch 19/20  Iteration 3301/3560 Training loss: 1.2139 6.4417 sec/batch
Epoch 19/20  Iteration 3302/3560 Training loss: 1.2135 7.6688 sec/batch
Epoch 19/20  Iteration 3303/3560 Training loss: 1.2131 7.6116 sec/batch
Epoch 19/20  Iteration 3304/3560 Training loss: 1.2128 6.6917 sec/batch
Epoch 19/20  Iteration 3305/3560 Training loss: 1.2127 6.4399 sec/batch
Epoch 19/20  Iteration 3306/3560 Training loss: 1.2126 8.2315 sec/batch
Epoch 19/20  Iteration 3307/3560 Training loss: 1.2124 6.8809 sec/batch
Epoch 19/20  Iteration 3308/3560 Training loss: 1.2123 6.3079 sec/batch
Epoch 19/20  Iteration 3309/3560 Training loss: 1.2122 6.6740 sec/batch
Epoch 19/20  Iteration 3310/3560 Training loss: 1.2121 7.3329 sec/batch
Epoch 19/20  Iteration 3311/3560 Training loss: 1.2120 6.9835 sec/batch
Epoch 19/20  Iteration 3312/3560 Training loss: 1.2119 8.4597 sec/batch
Epoch 19/20  Iteration 3313/3560 Training loss: 1.2117 6.4798 sec/batch
Epoch 19/20  Iteration 3314/3560 Training loss: 1.2118 6.3124 sec/batch
Epoch 19/20  Iteration 3315/3560 Training loss: 1.2116 6.2955 sec/batch
Epoch 19/20  Iteration 3316/3560 Training loss: 1.2115 6.3364 sec/batch
Epoch 19/20  Iteration 3317/3560 Training loss: 1.2114 6.8971 sec/batch
Epoch 19/20  Iteration 3318/3560 Training loss: 1.2113 6.4755 sec/batch
Epoch 19/20  Iteration 3319/3560 Training loss: 1.2110 7.3100 sec/batch
Epoch 19/20  Iteration 3320/3560 Training loss: 1.2107 6.6785 sec/batch
Epoch 19/20  Iteration 3321/3560 Training loss: 1.2108 6.7661 sec/batch
Epoch 19/20  Iteration 3322/3560 Training loss: 1.2108 6.5041 sec/batch
Epoch 19/20  Iteration 3323/3560 Training loss: 1.2107 6.4746 sec/batch
Epoch 19/20  Iteration 3324/3560 Training loss: 1.2107 7.2765 sec/batch
Epoch 19/20  Iteration 3325/3560 Training loss: 1.2106 6.6526 sec/batch
Epoch 19/20  Iteration 3326/3560 Training loss: 1.2103 7.1273 sec/batch
Epoch 19/20  Iteration 3327/3560 Training loss: 1.2099 6.5457 sec/batch
Epoch 19/20  Iteration 3328/3560 Training loss: 1.2099 6.4013 sec/batch
Epoch 19/20  Iteration 3329/3560 Training loss: 1.2098 6.2846 sec/batch
Epoch 19/20  Iteration 3330/3560 Training loss: 1.2094 6.2896 sec/batch
Epoch 19/20  Iteration 3331/3560 Training loss: 1.2095 6.2768 sec/batch
Epoch 19/20  Iteration 3332/3560 Training loss: 1.2094 6.2400 sec/batch
Epoch 19/20  Iteration 3333/3560 Training loss: 1.2092 6.2971 sec/batch
Epoch 19/20  Iteration 3334/3560 Training loss: 1.2089 6.2995 sec/batch
Epoch 19/20  Iteration 3335/3560 Training loss: 1.2085 6.2800 sec/batch
Epoch 19/20  Iteration 3336/3560 Training loss: 1.2084 6.2680 sec/batch
Epoch 19/20  Iteration 3337/3560 Training loss: 1.2085 6.2440 sec/batch
Epoch 19/20  Iteration 3338/3560 Training loss: 1.2085 6.8112 sec/batch
Epoch 19/20  Iteration 3339/3560 Training loss: 1.2086 6.4327 sec/batch
Epoch 19/20  Iteration 3340/3560 Training loss: 1.2085 6.3757 sec/batch
Epoch 19/20  Iteration 3341/3560 Training loss: 1.2087 6.6213 sec/batch
Epoch 19/20  Iteration 3342/3560 Training loss: 1.2088 6.3405 sec/batch
Epoch 19/20  Iteration 3343/3560 Training loss: 1.2088 6.2518 sec/batch
Epoch 19/20  Iteration 3344/3560 Training loss: 1.2088 6.3241 sec/batch
Epoch 19/20  Iteration 3345/3560 Training loss: 1.2091 6.2799 sec/batch
Epoch 19/20  Iteration 3346/3560 Training loss: 1.2092 6.2842 sec/batch
Epoch 19/20  Iteration 3347/3560 Training loss: 1.2092 6.2418 sec/batch
Epoch 19/20  Iteration 3348/3560 Training loss: 1.2094 6.1963 sec/batch
Epoch 19/20  Iteration 3349/3560 Training loss: 1.2093 6.2700 sec/batch
Epoch 19/20  Iteration 3350/3560 Training loss: 1.2094 6.2919 sec/batch
Epoch 19/20  Iteration 3351/3560 Training loss: 1.2095 6.2542 sec/batch
Epoch 19/20  Iteration 3352/3560 Training loss: 1.2097 6.2725 sec/batch
Epoch 19/20  Iteration 3353/3560 Training loss: 1.2099 6.2492 sec/batch
Epoch 19/20  Iteration 3354/3560 Training loss: 1.2098 6.2558 sec/batch
Epoch 19/20  Iteration 3355/3560 Training loss: 1.2095 6.2551 sec/batch
Epoch 19/20  Iteration 3356/3560 Training loss: 1.2094 6.2699 sec/batch
Epoch 19/20  Iteration 3357/3560 Training loss: 1.2094 6.6675 sec/batch
Epoch 19/20  Iteration 3358/3560 Training loss: 1.2094 6.3122 sec/batch
Epoch 19/20  Iteration 3359/3560 Training loss: 1.2094 6.2713 sec/batch
Epoch 19/20  Iteration 3360/3560 Training loss: 1.2093 6.3116 sec/batch
Epoch 19/20  Iteration 3361/3560 Training loss: 1.2093 6.3053 sec/batch
Epoch 19/20  Iteration 3362/3560 Training loss: 1.2092 6.2487 sec/batch
Epoch 19/20  Iteration 3363/3560 Training loss: 1.2089 6.2829 sec/batch
Epoch 19/20  Iteration 3364/3560 Training loss: 1.2091 6.3002 sec/batch
Epoch 19/20  Iteration 3365/3560 Training loss: 1.2092 6.3062 sec/batch
Epoch 19/20  Iteration 3366/3560 Training loss: 1.2092 6.2598 sec/batch
Epoch 19/20  Iteration 3367/3560 Training loss: 1.2092 6.3127 sec/batch
Epoch 19/20  Iteration 3368/3560 Training loss: 1.2092 6.3845 sec/batch
Epoch 19/20  Iteration 3369/3560 Training loss: 1.2092 6.2648 sec/batch
Epoch 19/20  Iteration 3370/3560 Training loss: 1.2091 6.2979 sec/batch
Epoch 19/20  Iteration 3371/3560 Training loss: 1.2093 6.2806 sec/batch
Epoch 19/20  Iteration 3372/3560 Training loss: 1.2097 6.2884 sec/batch
Epoch 19/20  Iteration 3373/3560 Training loss: 1.2097 6.3075 sec/batch
Epoch 19/20  Iteration 3374/3560 Training loss: 1.2098 7.2847 sec/batch
Epoch 19/20  Iteration 3375/3560 Training loss: 1.2097 7.1301 sec/batch
Epoch 19/20  Iteration 3376/3560 Training loss: 1.2096 7.5499 sec/batch
Epoch 19/20  Iteration 3377/3560 Training loss: 1.2098 7.9599 sec/batch
Epoch 19/20  Iteration 3378/3560 Training loss: 1.2098 7.5861 sec/batch
Epoch 19/20  Iteration 3379/3560 Training loss: 1.2099 6.3252 sec/batch
Epoch 19/20  Iteration 3380/3560 Training loss: 1.2097 6.3118 sec/batch
Epoch 19/20  Iteration 3381/3560 Training loss: 1.2096 6.3435 sec/batch
Epoch 19/20  Iteration 3382/3560 Training loss: 1.2098 6.2940 sec/batch
Epoch 20/20  Iteration 3383/3560 Training loss: 1.3296 6.2686 sec/batch
Epoch 20/20  Iteration 3384/3560 Training loss: 1.2758 6.3133 sec/batch
Epoch 20/20  Iteration 3385/3560 Training loss: 1.2561 6.1895 sec/batch
Epoch 20/20  Iteration 3386/3560 Training loss: 1.2485 6.3150 sec/batch
Epoch 20/20  Iteration 3387/3560 Training loss: 1.2351 6.2844 sec/batch
Epoch 20/20  Iteration 3388/3560 Training loss: 1.2221 6.7047 sec/batch
Epoch 20/20  Iteration 3389/3560 Training loss: 1.2211 6.2605 sec/batch
Epoch 20/20  Iteration 3390/3560 Training loss: 1.2181 6.1926 sec/batch
Epoch 20/20  Iteration 3391/3560 Training loss: 1.2175 6.2531 sec/batch
Epoch 20/20  Iteration 3392/3560 Training loss: 1.2154 6.3152 sec/batch
Epoch 20/20  Iteration 3393/3560 Training loss: 1.2122 6.2759 sec/batch
Epoch 20/20  Iteration 3394/3560 Training loss: 1.2125 6.6253 sec/batch
Epoch 20/20  Iteration 3395/3560 Training loss: 1.2120 6.3012 sec/batch
Epoch 20/20  Iteration 3396/3560 Training loss: 1.2130 6.3084 sec/batch
Epoch 20/20  Iteration 3397/3560 Training loss: 1.2118 6.3648 sec/batch
Epoch 20/20  Iteration 3398/3560 Training loss: 1.2095 6.2788 sec/batch
Epoch 20/20  Iteration 3399/3560 Training loss: 1.2097 6.2405 sec/batch
Epoch 20/20  Iteration 3400/3560 Training loss: 1.2104 6.2415 sec/batch
Epoch 20/20  Iteration 3401/3560 Training loss: 1.2098 6.2615 sec/batch
Epoch 20/20  Iteration 3402/3560 Training loss: 1.2112 6.2997 sec/batch
Epoch 20/20  Iteration 3403/3560 Training loss: 1.2107 6.2544 sec/batch
Epoch 20/20  Iteration 3404/3560 Training loss: 1.2107 6.2706 sec/batch
Epoch 20/20  Iteration 3405/3560 Training loss: 1.2102 6.2682 sec/batch
Epoch 20/20  Iteration 3406/3560 Training loss: 1.2106 6.2262 sec/batch
Epoch 20/20  Iteration 3407/3560 Training loss: 1.2104 6.2875 sec/batch
Epoch 20/20  Iteration 3408/3560 Training loss: 1.2088 6.2355 sec/batch
Epoch 20/20  Iteration 3409/3560 Training loss: 1.2074 6.2845 sec/batch
Epoch 20/20  Iteration 3410/3560 Training loss: 1.2080 6.2880 sec/batch
Epoch 20/20  Iteration 3411/3560 Training loss: 1.2082 6.2369 sec/batch
Epoch 20/20  Iteration 3412/3560 Training loss: 1.2086 6.2629 sec/batch
Epoch 20/20  Iteration 3413/3560 Training loss: 1.2078 6.6452 sec/batch
Epoch 20/20  Iteration 3414/3560 Training loss: 1.2068 6.2920 sec/batch
Epoch 20/20  Iteration 3415/3560 Training loss: 1.2067 6.3099 sec/batch
Epoch 20/20  Iteration 3416/3560 Training loss: 1.2068 6.2669 sec/batch
Epoch 20/20  Iteration 3417/3560 Training loss: 1.2062 6.2480 sec/batch
Epoch 20/20  Iteration 3418/3560 Training loss: 1.2059 6.3017 sec/batch
Epoch 20/20  Iteration 3419/3560 Training loss: 1.2052 6.2867 sec/batch
Epoch 20/20  Iteration 3420/3560 Training loss: 1.2041 6.2902 sec/batch
Epoch 20/20  Iteration 3421/3560 Training loss: 1.2031 6.2649 sec/batch
Epoch 20/20  Iteration 3422/3560 Training loss: 1.2030 6.3465 sec/batch
Epoch 20/20  Iteration 3423/3560 Training loss: 1.2024 6.2663 sec/batch
Epoch 20/20  Iteration 3424/3560 Training loss: 1.2032 6.2672 sec/batch
Epoch 20/20  Iteration 3425/3560 Training loss: 1.2030 6.2858 sec/batch
Epoch 20/20  Iteration 3426/3560 Training loss: 1.2024 6.2807 sec/batch
Epoch 20/20  Iteration 3427/3560 Training loss: 1.2026 6.2896 sec/batch
Epoch 20/20  Iteration 3428/3560 Training loss: 1.2018 6.3911 sec/batch
Epoch 20/20  Iteration 3429/3560 Training loss: 1.2013 6.3203 sec/batch
Epoch 20/20  Iteration 3430/3560 Training loss: 1.2009 6.3534 sec/batch
Epoch 20/20  Iteration 3431/3560 Training loss: 1.2008 6.2691 sec/batch
Epoch 20/20  Iteration 3432/3560 Training loss: 1.2010 6.7488 sec/batch
Epoch 20/20  Iteration 3433/3560 Training loss: 1.2004 6.3133 sec/batch
Epoch 20/20  Iteration 3434/3560 Training loss: 1.2010 6.2506 sec/batch
Epoch 20/20  Iteration 3435/3560 Training loss: 1.2009 6.7754 sec/batch
Epoch 20/20  Iteration 3436/3560 Training loss: 1.2010 6.3897 sec/batch
Epoch 20/20  Iteration 3437/3560 Training loss: 1.2009 6.2872 sec/batch
Epoch 20/20  Iteration 3438/3560 Training loss: 1.2008 6.3493 sec/batch
Epoch 20/20  Iteration 3439/3560 Training loss: 1.2011 6.3048 sec/batch
Epoch 20/20  Iteration 3440/3560 Training loss: 1.2008 6.2543 sec/batch
Epoch 20/20  Iteration 3441/3560 Training loss: 1.2003 6.2251 sec/batch
Epoch 20/20  Iteration 3442/3560 Training loss: 1.2009 6.1927 sec/batch
Epoch 20/20  Iteration 3443/3560 Training loss: 1.2009 6.2473 sec/batch
Epoch 20/20  Iteration 3444/3560 Training loss: 1.2019 6.2866 sec/batch
Epoch 20/20  Iteration 3445/3560 Training loss: 1.2021 6.2353 sec/batch
Epoch 20/20  Iteration 3446/3560 Training loss: 1.2020 6.2610 sec/batch
Epoch 20/20  Iteration 3447/3560 Training loss: 1.2020 6.2582 sec/batch
Epoch 20/20  Iteration 3448/3560 Training loss: 1.2021 6.2806 sec/batch
Epoch 20/20  Iteration 3449/3560 Training loss: 1.2023 6.2862 sec/batch
Epoch 20/20  Iteration 3450/3560 Training loss: 1.2020 6.3160 sec/batch
Epoch 20/20  Iteration 3451/3560 Training loss: 1.2021 6.6170 sec/batch
Epoch 20/20  Iteration 3452/3560 Training loss: 1.2020 6.3013 sec/batch
Epoch 20/20  Iteration 3453/3560 Training loss: 1.2026 6.2757 sec/batch
Epoch 20/20  Iteration 3454/3560 Training loss: 1.2029 6.3752 sec/batch
Epoch 20/20  Iteration 3455/3560 Training loss: 1.2033 6.2787 sec/batch
Epoch 20/20  Iteration 3456/3560 Training loss: 1.2030 6.2654 sec/batch
Epoch 20/20  Iteration 3457/3560 Training loss: 1.2029 6.2331 sec/batch
Epoch 20/20  Iteration 3458/3560 Training loss: 1.2031 6.2860 sec/batch
Epoch 20/20  Iteration 3459/3560 Training loss: 1.2030 6.2959 sec/batch
Epoch 20/20  Iteration 3460/3560 Training loss: 1.2029 6.2220 sec/batch
Epoch 20/20  Iteration 3461/3560 Training loss: 1.2022 6.2651 sec/batch
Epoch 20/20  Iteration 3462/3560 Training loss: 1.2021 6.2304 sec/batch
Epoch 20/20  Iteration 3463/3560 Training loss: 1.2018 6.2324 sec/batch
Epoch 20/20  Iteration 3464/3560 Training loss: 1.2016 6.2042 sec/batch
Epoch 20/20  Iteration 3465/3560 Training loss: 1.2012 6.2478 sec/batch
Epoch 20/20  Iteration 3466/3560 Training loss: 1.2011 6.2555 sec/batch
Epoch 20/20  Iteration 3467/3560 Training loss: 1.2008 6.2709 sec/batch
Epoch 20/20  Iteration 3468/3560 Training loss: 1.2007 6.2566 sec/batch
Epoch 20/20  Iteration 3469/3560 Training loss: 1.2006 6.2896 sec/batch
Epoch 20/20  Iteration 3470/3560 Training loss: 1.2005 6.6750 sec/batch
Epoch 20/20  Iteration 3471/3560 Training loss: 1.2002 6.2496 sec/batch
Epoch 20/20  Iteration 3472/3560 Training loss: 1.2002 6.3245 sec/batch
Epoch 20/20  Iteration 3473/3560 Training loss: 1.2000 6.2394 sec/batch
Epoch 20/20  Iteration 3474/3560 Training loss: 1.2000 6.2381 sec/batch
Epoch 20/20  Iteration 3475/3560 Training loss: 1.1996 6.3164 sec/batch
Epoch 20/20  Iteration 3476/3560 Training loss: 1.1993 6.6874 sec/batch
Epoch 20/20  Iteration 3477/3560 Training loss: 1.1991 7.2841 sec/batch
Epoch 20/20  Iteration 3478/3560 Training loss: 1.1992 7.5038 sec/batch
Epoch 20/20  Iteration 3479/3560 Training loss: 1.1992 6.8690 sec/batch
Epoch 20/20  Iteration 3480/3560 Training loss: 1.1988 8.5416 sec/batch
Epoch 20/20  Iteration 3481/3560 Training loss: 1.1985 9.2553 sec/batch
Epoch 20/20  Iteration 3482/3560 Training loss: 1.1982 7.0985 sec/batch
Epoch 20/20  Iteration 3483/3560 Training loss: 1.1982 7.0665 sec/batch
Epoch 20/20  Iteration 3484/3560 Training loss: 1.1981 7.0785 sec/batch
Epoch 20/20  Iteration 3485/3560 Training loss: 1.1978 6.9687 sec/batch
Epoch 20/20  Iteration 3486/3560 Training loss: 1.1977 7.0075 sec/batch
Epoch 20/20  Iteration 3487/3560 Training loss: 1.1976 7.6221 sec/batch
Epoch 20/20  Iteration 3488/3560 Training loss: 1.1975 7.0529 sec/batch
Epoch 20/20  Iteration 3489/3560 Training loss: 1.1974 6.9703 sec/batch
Epoch 20/20  Iteration 3490/3560 Training loss: 1.1975 6.8599 sec/batch
Epoch 20/20  Iteration 3491/3560 Training loss: 1.1973 6.8536 sec/batch
Epoch 20/20  Iteration 3492/3560 Training loss: 1.1974 6.8967 sec/batch
Epoch 20/20  Iteration 3493/3560 Training loss: 1.1972 6.9132 sec/batch
Epoch 20/20  Iteration 3494/3560 Training loss: 1.1973 6.9571 sec/batch
Epoch 20/20  Iteration 3495/3560 Training loss: 1.1971 6.9238 sec/batch
Epoch 20/20  Iteration 3496/3560 Training loss: 1.1971 6.9885 sec/batch
Epoch 20/20  Iteration 3497/3560 Training loss: 1.1968 6.9003 sec/batch
Epoch 20/20  Iteration 3498/3560 Training loss: 1.1966 7.0953 sec/batch
Epoch 20/20  Iteration 3499/3560 Training loss: 1.1967 7.6385 sec/batch
Epoch 20/20  Iteration 3500/3560 Training loss: 1.1967 7.9735 sec/batch
Epoch 20/20  Iteration 3501/3560 Training loss: 1.1967 6.9159 sec/batch
Epoch 20/20  Iteration 3502/3560 Training loss: 1.1967 7.0141 sec/batch
Epoch 20/20  Iteration 3503/3560 Training loss: 1.1966 6.9548 sec/batch
Epoch 20/20  Iteration 3504/3560 Training loss: 1.1963 7.4799 sec/batch
Epoch 20/20  Iteration 3505/3560 Training loss: 1.1961 7.0858 sec/batch
Epoch 20/20  Iteration 3506/3560 Training loss: 1.1959 7.1003 sec/batch
Epoch 20/20  Iteration 3507/3560 Training loss: 1.1959 7.0136 sec/batch
Epoch 20/20  Iteration 3508/3560 Training loss: 1.1956 6.9804 sec/batch
Epoch 20/20  Iteration 3509/3560 Training loss: 1.1955 7.0836 sec/batch
Epoch 20/20  Iteration 3510/3560 Training loss: 1.1956 7.0367 sec/batch
Epoch 20/20  Iteration 3511/3560 Training loss: 1.1954 7.1790 sec/batch
Epoch 20/20  Iteration 3512/3560 Training loss: 1.1951 7.0072 sec/batch
Epoch 20/20  Iteration 3513/3560 Training loss: 1.1947 7.0296 sec/batch
Epoch 20/20  Iteration 3514/3560 Training loss: 1.1946 7.0661 sec/batch
Epoch 20/20  Iteration 3515/3560 Training loss: 1.1947 6.9870 sec/batch
Epoch 20/20  Iteration 3516/3560 Training loss: 1.1948 7.0366 sec/batch
Epoch 20/20  Iteration 3517/3560 Training loss: 1.1948 6.9337 sec/batch
Epoch 20/20  Iteration 3518/3560 Training loss: 1.1948 6.9253 sec/batch
Epoch 20/20  Iteration 3519/3560 Training loss: 1.1949 6.9747 sec/batch
Epoch 20/20  Iteration 3520/3560 Training loss: 1.1950 6.9639 sec/batch
Epoch 20/20  Iteration 3521/3560 Training loss: 1.1950 7.4375 sec/batch
Epoch 20/20  Iteration 3522/3560 Training loss: 1.1951 6.9401 sec/batch
Epoch 20/20  Iteration 3523/3560 Training loss: 1.1954 7.2840 sec/batch
Epoch 20/20  Iteration 3524/3560 Training loss: 1.1955 7.4973 sec/batch
Epoch 20/20  Iteration 3525/3560 Training loss: 1.1955 6.9956 sec/batch
Epoch 20/20  Iteration 3526/3560 Training loss: 1.1956 7.3638 sec/batch
Epoch 20/20  Iteration 3527/3560 Training loss: 1.1955 7.0423 sec/batch
Epoch 20/20  Iteration 3528/3560 Training loss: 1.1957 7.0321 sec/batch
Epoch 20/20  Iteration 3529/3560 Training loss: 1.1958 7.1624 sec/batch
Epoch 20/20  Iteration 3530/3560 Training loss: 1.1960 6.6271 sec/batch
Epoch 20/20  Iteration 3531/3560 Training loss: 1.1962 6.2943 sec/batch
Epoch 20/20  Iteration 3532/3560 Training loss: 1.1961 6.3705 sec/batch
Epoch 20/20  Iteration 3533/3560 Training loss: 1.1958 6.2963 sec/batch
Epoch 20/20  Iteration 3534/3560 Training loss: 1.1957 6.2914 sec/batch
Epoch 20/20  Iteration 3535/3560 Training loss: 1.1958 6.2938 sec/batch
Epoch 20/20  Iteration 3536/3560 Training loss: 1.1957 6.3318 sec/batch
Epoch 20/20  Iteration 3537/3560 Training loss: 1.1957 6.3618 sec/batch
Epoch 20/20  Iteration 3538/3560 Training loss: 1.1957 6.3994 sec/batch
Epoch 20/20  Iteration 3539/3560 Training loss: 1.1957 6.5129 sec/batch
Epoch 20/20  Iteration 3540/3560 Training loss: 1.1957 6.2801 sec/batch
Epoch 20/20  Iteration 3541/3560 Training loss: 1.1955 6.3134 sec/batch
Epoch 20/20  Iteration 3542/3560 Training loss: 1.1956 6.4123 sec/batch
Epoch 20/20  Iteration 3543/3560 Training loss: 1.1958 6.3343 sec/batch
Epoch 20/20  Iteration 3544/3560 Training loss: 1.1959 6.2830 sec/batch
Epoch 20/20  Iteration 3545/3560 Training loss: 1.1958 6.2868 sec/batch
Epoch 20/20  Iteration 3546/3560 Training loss: 1.1958 6.3630 sec/batch
Epoch 20/20  Iteration 3547/3560 Training loss: 1.1958 6.2744 sec/batch
Epoch 20/20  Iteration 3548/3560 Training loss: 1.1958 6.2753 sec/batch
Epoch 20/20  Iteration 3549/3560 Training loss: 1.1960 6.2673 sec/batch
Epoch 20/20  Iteration 3550/3560 Training loss: 1.1964 6.3257 sec/batch
Epoch 20/20  Iteration 3551/3560 Training loss: 1.1965 6.3086 sec/batch
Epoch 20/20  Iteration 3552/3560 Training loss: 1.1965 6.2953 sec/batch
Epoch 20/20  Iteration 3553/3560 Training loss: 1.1964 6.2732 sec/batch
Epoch 20/20  Iteration 3554/3560 Training loss: 1.1963 6.3206 sec/batch
Epoch 20/20  Iteration 3555/3560 Training loss: 1.1965 6.2781 sec/batch
Epoch 20/20  Iteration 3556/3560 Training loss: 1.1966 6.2947 sec/batch
Epoch 20/20  Iteration 3557/3560 Training loss: 1.1966 6.4850 sec/batch
Epoch 20/20  Iteration 3558/3560 Training loss: 1.1965 6.5041 sec/batch
Epoch 20/20  Iteration 3559/3560 Training loss: 1.1965 6.2433 sec/batch
Epoch 20/20  Iteration 3560/3560 Training loss: 1.1967 6.3140 sec/batch

In [15]:
tf.train.get_checkpoint_state('checkpoints/anna')

Sampling

Now that the network is trained, we can use it to generate new text. The idea is that we pass in a character and the network predicts the next one. We then feed that prediction back in as the next input, and keep going to generate entirely new text. I also included some functionality to prime the network with a string, building up the hidden state from that text before sampling begins.

The network gives us predictions for each character. To reduce noise and make the output a little less random, I'll only choose the next character from the top N most likely characters.


In [16]:
def pick_top_n(preds, vocab_size, top_n=5):
    # Flatten the prediction array to a 1D probability vector
    p = np.squeeze(preds)
    # Zero out everything except the top_n most likely characters
    p[np.argsort(p)[:-top_n]] = 0
    # Renormalize so the remaining probabilities sum to 1
    p = p / np.sum(p)
    # Sample a character index from the reduced distribution
    c = np.random.choice(vocab_size, 1, p=p)[0]
    return c
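
As a quick sanity check (a toy example, not from the original run), only the top_n indices of a distribution can ever be sampled. With a made-up 5-character distribution and top_n=3, pick_top_n should only ever return index 2, 3, or 4:


In [ ]:
# Hypothetical toy distribution over a 5-character vocabulary, just to illustrate pick_top_n
toy_preds = np.array([[0.05, 0.10, 0.40, 0.30, 0.15]])
print([pick_top_n(toy_preds, vocab_size=5, top_n=3) for _ in range(10)])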

In [17]:
def sample(checkpoint, n_samples, lstm_size, vocab_size, prime="The "):
    # Start the output with the characters of the prime string
    samples = [c for c in prime]
    model = build_rnn(vocab_size, lstm_size=lstm_size, sampling=True)
    saver = tf.train.Saver()
    with tf.Session() as sess:
        saver.restore(sess, checkpoint)
        new_state = sess.run(model.initial_state)
        for c in prime:
            # Run each prime character through the network to build up the state
            x = np.zeros((1, 1))
            x[0,0] = vocab_to_int[c]
            feed = {model.inputs: x,
                    model.keep_prob: 1.,
                    model.initial_state: new_state}
            preds, new_state = sess.run([model.preds, model.final_state], 
                                         feed_dict=feed)

        c = pick_top_n(preds, vocab_size)
        samples.append(int_to_vocab[c])

        # Feed each sampled character back in to predict the next one
        for i in range(n_samples):
            x[0,0] = c
            feed = {model.inputs: x,
                    model.keep_prob: 1.,
                    model.initial_state: new_state}
            preds, new_state = sess.run([model.preds, model.final_state], 
                                         feed_dict=feed)

            c = pick_top_n(preds, vocab_size)
            samples.append(int_to_vocab[c])
        
    return ''.join(samples)
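
A couple of notes on this cell: keep_prob is fed as 1.0 so dropout is disabled while sampling, and the LSTM state returned as final_state is carried forward between steps, so each new character is conditioned on everything generated so far.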

In [18]:
checkpoint = "checkpoints/anna/i3560_l512_1.122.ckpt"
samp = sample(checkpoint, 2000, lstm_size, len(vocab), prime="Far")
print(samp)


INFO:tensorflow:Restoring parameters from checkpoints/anna/i3560_l512_1.122.ckpt
---------------------------------------------------------------------------
NotFoundError                             Traceback (most recent call last)
/Users/arturops/anaconda3/envs/rnn/lib/python3.6/site-packages/tensorflow/python/client/session.py in _do_call(self, fn, *args)
   1038     try:
-> 1039       return fn(*args)
   1040     except errors.OpError as e:

/Users/arturops/anaconda3/envs/rnn/lib/python3.6/site-packages/tensorflow/python/client/session.py in _run_fn(session, feed_dict, fetch_list, target_list, options, run_metadata)
   1020                                  feed_dict, fetch_list, target_list,
-> 1021                                  status, run_metadata)
   1022 

/Users/arturops/anaconda3/envs/rnn/lib/python3.6/contextlib.py in __exit__(self, type, value, traceback)
     88             try:
---> 89                 next(self.gen)
     90             except StopIteration:

/Users/arturops/anaconda3/envs/rnn/lib/python3.6/site-packages/tensorflow/python/framework/errors_impl.py in raise_exception_on_not_ok_status()
    465           compat.as_text(pywrap_tensorflow.TF_Message(status)),
--> 466           pywrap_tensorflow.TF_GetCode(status))
    467   finally:

NotFoundError: Unsuccessful TensorSliceReader constructor: Failed to find any matching files for checkpoints/anna/i3560_l512_1.122.ckpt
	 [[Node: save/RestoreV2 = RestoreV2[dtypes=[DT_FLOAT], _device="/job:localhost/replica:0/task:0/cpu:0"](_recv_save/Const_0, save/RestoreV2/tensor_names, save/RestoreV2/shape_and_slices)]]

During handling of the above exception, another exception occurred:

NotFoundError                             Traceback (most recent call last)
<ipython-input-18-91ad82b46673> in <module>()
      1 checkpoint = "checkpoints/anna/i3560_l512_1.122.ckpt"
----> 2 samp = sample(checkpoint, 2000, lstm_size, len(vocab), prime="Far")
      3 print(samp)

<ipython-input-17-a7ae04af7e97> in sample(checkpoint, n_samples, lstm_size, vocab_size, prime)
      5     saver = tf.train.Saver()
      6     with tf.Session() as sess:
----> 7         saver.restore(sess, checkpoint)
      8         new_state = sess.run(model.initial_state)
      9         for c in prime:

/Users/arturops/anaconda3/envs/rnn/lib/python3.6/site-packages/tensorflow/python/training/saver.py in restore(self, sess, save_path)
   1455     logging.info("Restoring parameters from %s", save_path)
   1456     sess.run(self.saver_def.restore_op_name,
-> 1457              {self.saver_def.filename_tensor_name: save_path})
   1458 
   1459   @staticmethod

/Users/arturops/anaconda3/envs/rnn/lib/python3.6/site-packages/tensorflow/python/client/session.py in run(self, fetches, feed_dict, options, run_metadata)
    776     try:
    777       result = self._run(None, fetches, feed_dict, options_ptr,
--> 778                          run_metadata_ptr)
    779       if run_metadata:
    780         proto_data = tf_session.TF_GetBuffer(run_metadata_ptr)

/Users/arturops/anaconda3/envs/rnn/lib/python3.6/site-packages/tensorflow/python/client/session.py in _run(self, handle, fetches, feed_dict, options, run_metadata)
    980     if final_fetches or final_targets:
    981       results = self._do_run(handle, final_targets, final_fetches,
--> 982                              feed_dict_string, options, run_metadata)
    983     else:
    984       results = []

/Users/arturops/anaconda3/envs/rnn/lib/python3.6/site-packages/tensorflow/python/client/session.py in _do_run(self, handle, target_list, fetch_list, feed_dict, options, run_metadata)
   1030     if handle is None:
   1031       return self._do_call(_run_fn, self._session, feed_dict, fetch_list,
-> 1032                            target_list, options, run_metadata)
   1033     else:
   1034       return self._do_call(_prun_fn, self._session, handle, feed_dict,

/Users/arturops/anaconda3/envs/rnn/lib/python3.6/site-packages/tensorflow/python/client/session.py in _do_call(self, fn, *args)
   1050         except KeyError:
   1051           pass
-> 1052       raise type(e)(node_def, op, message)
   1053 
   1054   def _extend_graph(self):

NotFoundError: Unsuccessful TensorSliceReader constructor: Failed to find any matching files for checkpoints/anna/i3560_l512_1.122.ckpt
	 [[Node: save/RestoreV2 = RestoreV2[dtypes=[DT_FLOAT], _device="/job:localhost/replica:0/task:0/cpu:0"](_recv_save/Const_0, save/RestoreV2/tensor_names, save/RestoreV2/shape_and_slices)]]

Caused by op 'save/RestoreV2', defined at:
  File "/Users/arturops/anaconda3/envs/rnn/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/Users/arturops/anaconda3/envs/rnn/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/Users/arturops/anaconda3/envs/rnn/lib/python3.6/site-packages/ipykernel_launcher.py", line 16, in <module>
    app.launch_new_instance()
  File "/Users/arturops/anaconda3/envs/rnn/lib/python3.6/site-packages/traitlets/config/application.py", line 658, in launch_instance
    app.start()
  File "/Users/arturops/anaconda3/envs/rnn/lib/python3.6/site-packages/ipykernel/kernelapp.py", line 477, in start
    ioloop.IOLoop.instance().start()
  File "/Users/arturops/anaconda3/envs/rnn/lib/python3.6/site-packages/zmq/eventloop/ioloop.py", line 177, in start
    super(ZMQIOLoop, self).start()
  File "/Users/arturops/anaconda3/envs/rnn/lib/python3.6/site-packages/tornado/ioloop.py", line 888, in start
    handler_func(fd_obj, events)
  File "/Users/arturops/anaconda3/envs/rnn/lib/python3.6/site-packages/tornado/stack_context.py", line 277, in null_wrapper
    return fn(*args, **kwargs)
  File "/Users/arturops/anaconda3/envs/rnn/lib/python3.6/site-packages/zmq/eventloop/zmqstream.py", line 440, in _handle_events
    self._handle_recv()
  File "/Users/arturops/anaconda3/envs/rnn/lib/python3.6/site-packages/zmq/eventloop/zmqstream.py", line 472, in _handle_recv
    self._run_callback(callback, msg)
  File "/Users/arturops/anaconda3/envs/rnn/lib/python3.6/site-packages/zmq/eventloop/zmqstream.py", line 414, in _run_callback
    callback(*args, **kwargs)
  File "/Users/arturops/anaconda3/envs/rnn/lib/python3.6/site-packages/tornado/stack_context.py", line 277, in null_wrapper
    return fn(*args, **kwargs)
  File "/Users/arturops/anaconda3/envs/rnn/lib/python3.6/site-packages/ipykernel/kernelbase.py", line 283, in dispatcher
    return self.dispatch_shell(stream, msg)
  File "/Users/arturops/anaconda3/envs/rnn/lib/python3.6/site-packages/ipykernel/kernelbase.py", line 235, in dispatch_shell
    handler(stream, idents, msg)
  File "/Users/arturops/anaconda3/envs/rnn/lib/python3.6/site-packages/ipykernel/kernelbase.py", line 399, in execute_request
    user_expressions, allow_stdin)
  File "/Users/arturops/anaconda3/envs/rnn/lib/python3.6/site-packages/ipykernel/ipkernel.py", line 196, in do_execute
    res = shell.run_cell(code, store_history=store_history, silent=silent)
  File "/Users/arturops/anaconda3/envs/rnn/lib/python3.6/site-packages/ipykernel/zmqshell.py", line 533, in run_cell
    return super(ZMQInteractiveShell, self).run_cell(*args, **kwargs)
  File "/Users/arturops/anaconda3/envs/rnn/lib/python3.6/site-packages/IPython/core/interactiveshell.py", line 2683, in run_cell
    interactivity=interactivity, compiler=compiler, result=result)
  File "/Users/arturops/anaconda3/envs/rnn/lib/python3.6/site-packages/IPython/core/interactiveshell.py", line 2787, in run_ast_nodes
    if self.run_code(code, result):
  File "/Users/arturops/anaconda3/envs/rnn/lib/python3.6/site-packages/IPython/core/interactiveshell.py", line 2847, in run_code
    exec(code_obj, self.user_global_ns, self.user_ns)
  File "<ipython-input-18-91ad82b46673>", line 2, in <module>
    samp = sample(checkpoint, 2000, lstm_size, len(vocab), prime="Far")
  File "<ipython-input-17-a7ae04af7e97>", line 5, in sample
    saver = tf.train.Saver()
  File "/Users/arturops/anaconda3/envs/rnn/lib/python3.6/site-packages/tensorflow/python/training/saver.py", line 1056, in __init__
    self.build()
  File "/Users/arturops/anaconda3/envs/rnn/lib/python3.6/site-packages/tensorflow/python/training/saver.py", line 1086, in build
    restore_sequentially=self._restore_sequentially)
  File "/Users/arturops/anaconda3/envs/rnn/lib/python3.6/site-packages/tensorflow/python/training/saver.py", line 691, in build
    restore_sequentially, reshape)
  File "/Users/arturops/anaconda3/envs/rnn/lib/python3.6/site-packages/tensorflow/python/training/saver.py", line 407, in _AddRestoreOps
    tensors = self.restore_op(filename_tensor, saveable, preferred_shard)
  File "/Users/arturops/anaconda3/envs/rnn/lib/python3.6/site-packages/tensorflow/python/training/saver.py", line 247, in restore_op
    [spec.tensor.dtype])[0])
  File "/Users/arturops/anaconda3/envs/rnn/lib/python3.6/site-packages/tensorflow/python/ops/gen_io_ops.py", line 669, in restore_v2
    dtypes=dtypes, name=name)
  File "/Users/arturops/anaconda3/envs/rnn/lib/python3.6/site-packages/tensorflow/python/framework/op_def_library.py", line 768, in apply_op
    op_def=op_def)
  File "/Users/arturops/anaconda3/envs/rnn/lib/python3.6/site-packages/tensorflow/python/framework/ops.py", line 2336, in create_op
    original_op=self._default_original_op, op_def=op_def)
  File "/Users/arturops/anaconda3/envs/rnn/lib/python3.6/site-packages/tensorflow/python/framework/ops.py", line 1228, in __init__
    self._traceback = _extract_stack()

NotFoundError (see above for traceback): Unsuccessful TensorSliceReader constructor: Failed to find any matching files for checkpoints/anna/i3560_l512_1.122.ckpt
	 [[Node: save/RestoreV2 = RestoreV2[dtypes=[DT_FLOAT], _device="/job:localhost/replica:0/task:0/cpu:0"](_recv_save/Const_0, save/RestoreV2/tensor_names, save/RestoreV2/shape_and_slices)]]
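
The restore fails because that exact checkpoint file isn't present on disk. Rather than hard-coding a filename, one option (a minimal sketch, assuming checkpoints were saved to checkpoints/anna during training) is to ask TensorFlow for the most recent checkpoint in the directory and sample from that:


In [ ]:
# tf.train.latest_checkpoint returns the newest checkpoint path, or None if there are none
latest = tf.train.latest_checkpoint('checkpoints/anna')
if latest is not None:
    samp = sample(latest, 2000, lstm_size, len(vocab), prime="Far")
    print(samp)
else:
    print("No checkpoints found in checkpoints/anna")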

In [ ]:
checkpoint = "checkpoints/anna/i200_l512_2.432.ckpt"
samp = sample(checkpoint, 1000, lstm_size, len(vocab), prime="Far")
print(samp)

In [ ]:
checkpoint = "checkpoints/anna/i600_l512_1.750.ckpt"
samp = sample(checkpoint, 1000, lstm_size, len(vocab), prime="Far")
print(samp)

In [ ]:
checkpoint = "checkpoints/anna/i1000_l512_1.484.ckpt"
samp = sample(checkpoint, 1000, lstm_size, len(vocab), prime="Far")
print(samp)