Skip-gram word2vec

In this notebook, I'll lead you through implementing the word2vec algorithm in TensorFlow using the skip-gram architecture. By implementing this, you'll learn about embedding words for use in natural language processing. This will come in handy when dealing with things like machine translation.

Readings

Here are the resources I used to build this notebook. I suggest reading these either beforehand or while you're working on this material.

Word embeddings

When you're dealing with words in text, you end up with tens of thousands of classes to predict, one for each word. Trying to one-hot encode these words is massively inefficient: you'll have one element set to 1 and the other 50,000 set to 0. The matrix multiplication going into the first hidden layer will have almost all of the resulting values be zero. This is a huge waste of computation.

To solve this problem and greatly increase the efficiency of our networks, we use what are called embeddings. Embeddings are just a fully connected layer like you've seen before. We call this layer the embedding layer and the weights are embedding weights. We skip the multiplication into the embedding layer by instead directly grabbing the hidden layer values from the weight matrix. We can do this because the multiplication of a one-hot encoded vector with a matrix returns the row of the matrix corresponding to the index of the "on" input unit.

Instead of doing the matrix multiplication, we use the weight matrix as a lookup table. We encode the words as integers, for example "heart" is encoded as 958, "mind" as 18094. Then to get hidden layer values for "heart", you just take the 958th row of the embedding matrix. This process is called an embedding lookup and the number of hidden units is the embedding dimension.

There is nothing magical going on here. The embedding lookup table is just a weight matrix. The embedding layer is just a hidden layer. The lookup is just a shortcut for the matrix multiplication. The lookup table is trained just like any weight matrix as well.
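To see that the shortcut really is equivalent, here's a small NumPy check (a toy example, not part of the notebook's code):

import numpy as np

# toy sizes: a vocabulary of 5 words and 3 hidden units
embedding = np.random.rand(5, 3)   # the embedding weight matrix

word_idx = 2                       # integer-encoded word
one_hot = np.zeros(5)
one_hot[word_idx] = 1

# multiplying the one-hot vector by the matrix...
hidden_from_matmul = one_hot @ embedding
# ...returns exactly the row you'd get from a direct lookup
hidden_from_lookup = embedding[word_idx]

assert np.allclose(hidden_from_matmul, hidden_from_lookup)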

Embeddings aren't only used for words of course. You can use them for any model where you have a massive number of classes. A particular type of model called Word2Vec uses the embedding layer to find vector representations of words that contain semantic meaning.

Word2Vec

The word2vec algorithm finds much more efficient representations by finding vectors that represent the words. These vectors also contain semantic information about the words. Words that show up in similar contexts, such as "black", "white", and "red", will have vectors near each other. There are two architectures for implementing word2vec, CBOW (Continuous Bag-Of-Words) and Skip-gram.

In this implementation, we'll be using the skip-gram architecture because it performs better than CBOW. Here, we pass in a word and try to predict the words surrounding it in the text. In this way, we can train the network to learn representations for words that show up in similar contexts.
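As a concrete toy example (not from the notebook), here are the input-target pairs the skip-gram setup would produce for one word of a short sentence with a window of size 2:

sentence = "the quick brown fox jumps".split()
idx = 2                       # input word: "brown"
window = 2

# words up to `window` positions before and after the input word
targets = sentence[idx-window:idx] + sentence[idx+1:idx+window+1]
pairs = [(sentence[idx], target) for target in targets]
# pairs -> [('brown', 'the'), ('brown', 'quick'), ('brown', 'fox'), ('brown', 'jumps')]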

First up, importing packages.


In [1]:
import time

import numpy as np
import tensorflow as tf

import utils

Load the text8 dataset, a file of cleaned-up Wikipedia articles from Matt Mahoney. The next cell will download the dataset archive and extract it into the data folder. Afterwards you can delete the archive file to save storage space.


In [2]:
from urllib.request import urlretrieve
from os.path import isfile, isdir
from tqdm import tqdm
import zipfile

dataset_folder_path = 'data'
dataset_filename = 'text8.zip'
dataset_name = 'Text8 Dataset'

class DLProgress(tqdm):
    last_block = 0

    def hook(self, block_num=1, block_size=1, total_size=None):
        self.total = total_size
        self.update((block_num - self.last_block) * block_size)
        self.last_block = block_num

if not isfile(dataset_filename):
    with DLProgress(unit='B', unit_scale=True, miniters=1, desc=dataset_name) as pbar:
        urlretrieve(
            'http://mattmahoney.net/dc/text8.zip',
            dataset_filename,
            pbar.hook)

if not isdir(dataset_folder_path):
    with zipfile.ZipFile(dataset_filename) as zip_ref:
        zip_ref.extractall(dataset_folder_path)
        
with open('data/text8') as f:
    text = f.read()

Preprocessing

Here I'm fixing up the text to make training easier. This comes from the utils module I wrote. The preprocess function converts any punctuation into tokens, so a period is changed to <PERIOD>. In this dataset, there aren't any periods, but it will help in other NLP problems. I'm also removing all words that show up five or fewer times in the dataset. This will greatly reduce issues due to noise in the data and improve the quality of the vector representations. If you want to write your own functions for this stuff, go for it.
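In case you do write your own, here's a rough sketch of what a function like utils.preprocess could look like; the real utils code may differ in detail:

from collections import Counter

def simple_preprocess(text, min_count=5):
    ''' Sketch of a preprocess step: tokenize punctuation and drop rare words. '''
    text = text.lower()
    text = text.replace('.', ' <PERIOD> ')
    text = text.replace(',', ' <COMMA> ')
    text = text.replace('"', ' <QUOTATION_MARK> ')
    words = text.split()

    # remove all words that show up min_count times or fewer
    word_counts = Counter(words)
    return [word for word in words if word_counts[word] > min_count]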


In [3]:
words = utils.preprocess(text)
print(words[:30])


['anarchism', 'originated', 'as', 'a', 'term', 'of', 'abuse', 'first', 'used', 'against', 'early', 'working', 'class', 'radicals', 'including', 'the', 'diggers', 'of', 'the', 'english', 'revolution', 'and', 'the', 'sans', 'culottes', 'of', 'the', 'french', 'revolution', 'whilst']

In [4]:
print("Total words: {}".format(len(words)))
print("Unique words: {}".format(len(set(words))))


Total words: 16680599
Unique words: 63641

And here I'm creating dictionaries to convert words to integers and back again, integers to words. The integers are assigned in descending frequency order, so the most frequent word ("the") is given the integer 0, the next most frequent gets 1, and so on. The words are converted to integers and stored in the list int_words.
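If you'd rather not rely on the utils module here either, a lookup-table builder along these lines would do the same job (a sketch; the actual utils code may differ):

from collections import Counter

def simple_create_lookup_tables(words):
    ''' Sketch: map words to integers in descending frequency order. '''
    word_counts = Counter(words)
    sorted_vocab = sorted(word_counts, key=word_counts.get, reverse=True)
    int_to_vocab = {ii: word for ii, word in enumerate(sorted_vocab)}
    vocab_to_int = {word: ii for ii, word in int_to_vocab.items()}
    return vocab_to_int, int_to_vocab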


In [5]:
vocab_to_int, int_to_vocab = utils.create_lookup_tables(words)
int_words = [vocab_to_int[word] for word in words]

Subsampling

Words that show up often such as "the", "of", and "for" don't provide much context to the nearby words. If we discard some of them, we can remove some of the noise from our data and in return get faster training and better representations. This process is called subsampling by Mikolov. For each word $w_i$ in the training set, we'll discard it with probability given by

$$ P(w_i) = 1 - \sqrt{\frac{t}{f(w_i)}} $$

where $t$ is a threshold parameter and $f(w_i)$ is the frequency of word $w_i$ in the total dataset.
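To get a feel for the numbers: with $t = 10^{-5}$, a word that makes up 1% of the dataset ($f(w_i) = 10^{-2}$) is discarded with probability $P(w_i) = 1 - \sqrt{10^{-5}/10^{-2}} \approx 0.97$, while a word with $f(w_i) \le 10^{-5}$ is never discarded.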

I'm going to leave this up to you as an exercise. Check out my solution to see how I did it.

Exercise: Implement subsampling for the words in int_words. That is, go through int_words and discard each word with the probability $P(w_i)$ shown above. Note that $P(w_i)$ is the probability that a word is discarded. Assign the subsampled data to train_words.


In [6]:
from collections import Counter
import random

threshold = 1e-5
word_counts = Counter(int_words)
total_count = len(int_words)
freqs = {word: count/total_count for word, count in word_counts.items()}
p_drop = {word: 1 - np.sqrt(threshold/freqs[word]) for word in word_counts}
train_words = [word for word in int_words if random.random() < (1 - p_drop[word])]

Making batches

Now that our data is in good shape, we need to get it into the proper form to pass it into our network. With the skip-gram architecture, for each word in the text, we want to grab all the words in a window around that word, with size $C$.

From Mikolov et al.:

"Since the more distant words are usually less related to the current word than those close to it, we give less weight to the distant words by sampling less from those words in our training examples... If we choose $C = 5$, for each training word we will select randomly a number $R$ in range $< 1; C >$, and then use $R$ words from history and $R$ words from the future of the current word as correct labels."

Exercise: Implement a function get_target that receives a list of words, an index, and a window size, then returns a list of words in the window around the index. Make sure to use the algorithm described above, where you choose a random number of words from the window.


In [7]:
def get_target(words, idx, window_size=5):
    ''' Get a list of words in a window around an index. '''
    
    R = np.random.randint(1, window_size+1)
    start = idx - R if (idx - R) > 0 else 0
    stop = idx + R
    target_words = set(words[start:idx] + words[idx+1:stop+1])
    
    return list(target_words)

Here's a function that returns batches for our network. The idea is that it grabs batch_size words from a words list. Then for each of those words, it gets the target words in the window. I haven't found a way to pass in a random number of target words and get it to work with the architecture, so I make one row per input-target pair. This is a generator function, by the way, which helps save memory.


In [8]:
def get_batches(words, batch_size, window_size=5):
    ''' Create a generator of word batches as a tuple (inputs, targets) '''
    
    n_batches = len(words)//batch_size
    
    # only full batches
    words = words[:n_batches*batch_size]
    
    for idx in range(0, len(words), batch_size):
        x, y = [], []
        batch = words[idx:idx+batch_size]
        for ii in range(len(batch)):
            batch_x = batch[ii]
            batch_y = get_target(batch, ii, window_size)
            y.extend(batch_y)
            x.extend([batch_x]*len(batch_y))
        yield x, y
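
As a quick sanity check (not part of the original notebook), you can pull one batch from the generator using a toy list of integer-encoded words:

toy_words = list(range(20))
x, y = next(get_batches(toy_words, batch_size=10, window_size=3))
print(len(x) == len(y))   # True: one row per (input, target) pair
print(x[:5], y[:5])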

Building the graph

From Chris McCormick's blog, we can see the general structure of our network.

The input words are passed in as one-hot encoded vectors. These go into a hidden layer of linear units, then into a softmax layer. We'll use the softmax layer to make a prediction like normal.

The idea here is to train the hidden layer weight matrix to find efficient representations for our words. We can discard the softmax layer because we don't really care about making predictions with this network. We just want the embedding matrix so we can use it in other networks we build from the dataset.

I'm going to have you build the graph in stages now. First off, creating the inputs and labels placeholders like normal.

Exercise: Assign inputs and labels using tf.placeholder. We're going to be passing in integers, so set the data types to tf.int32. The batches we're passing in will have varying sizes, so set the batch sizes to [None]. To make things work later, you'll need to set the second dimension of labels to None or 1.


In [9]:
train_graph = tf.Graph()
with train_graph.as_default():
    inputs = tf.placeholder(tf.int32, [None], name='inputs')
    labels = tf.placeholder(tf.int32, [None, None], name='labels')

Embedding

The embedding matrix has a size of the number of words by the number of units in the hidden layer. So, if you have 10,000 words and 300 hidden units, the matrix will have size $10,000 \times 300$. Remember that we're using tokenized data for our inputs, usually as integers, where the number of tokens is the number of words in our vocabulary.

Exercise: TensorFlow provides a convenient function tf.nn.embedding_lookup that does this lookup for us. You pass in the embedding matrix and a tensor of integers, then it returns rows in the matrix corresponding to those integers. Below, set the number of embedding features you'll use (200 is a good start), create the embedding matrix variable, and use tf.nn.embedding_lookup to get the embedding tensors. For the embedding matrix, I suggest you initialize it with uniform random numbers between -1 and 1 using tf.random_uniform.


In [10]:
n_vocab = len(int_to_vocab)
n_embedding = 200 # Number of embedding features 
with train_graph.as_default():
    embedding = tf.Variable(tf.random_uniform((n_vocab, n_embedding), -1, 1))
    embed = tf.nn.embedding_lookup(embedding, inputs)

Negative sampling

For every example we give the network, we train it using the output from the softmax layer. That means for each input, we're making very small changes to millions of weights even though we only have one true example. This makes training the network very inefficient. We can approximate the loss from the softmax layer by only updating a small subset of all the weights at once. We'll update the weights for the correct label, but only a small random sample of the incorrect labels. This is called "negative sampling". TensorFlow has a convenient function to do this, tf.nn.sampled_softmax_loss.

Exercise: Below, create weights and biases for the softmax layer. Then, use tf.nn.sampled_softmax_loss to calculate the loss. Be sure to read the documentation to figure out how it works.


In [11]:
# Number of negative labels to sample
n_sampled = 100
with train_graph.as_default():
    softmax_w = tf.Variable(tf.truncated_normal((n_vocab, n_embedding), stddev=0.1))
    softmax_b = tf.Variable(tf.zeros(n_vocab))
    
    # Calculate the loss using negative sampling
    loss = tf.nn.sampled_softmax_loss(softmax_w, softmax_b, 
                                      labels, embed,
                                      n_sampled, n_vocab)
    
    cost = tf.reduce_mean(loss)
    optimizer = tf.train.AdamOptimizer().minimize(cost)

Validation

This code is from Thushan Ganegedara's implementation. Here we're going to choose a few common words and a few uncommon words. Then, we'll print out the words closest to them. It's a nice way to check that our embedding table is grouping together words with similar semantic meanings.


In [12]:
with train_graph.as_default():
    ## From Thushan Ganegedara's implementation
    valid_size = 16 # Random set of words to evaluate similarity on.
    valid_window = 100
    # pick 8 samples each from the ranges (0,100) and (1000,1100); lower id implies more frequent
    valid_examples = np.array(random.sample(range(valid_window), valid_size//2))
    valid_examples = np.append(valid_examples, 
                               random.sample(range(1000,1000+valid_window), valid_size//2))

    valid_dataset = tf.constant(valid_examples, dtype=tf.int32)
    
    # We use the cosine distance:
    norm = tf.sqrt(tf.reduce_sum(tf.square(embedding), 1, keep_dims=True))
    normalized_embedding = embedding / norm
    valid_embedding = tf.nn.embedding_lookup(normalized_embedding, valid_dataset)
    similarity = tf.matmul(valid_embedding, tf.transpose(normalized_embedding))

In [ ]:
# If the checkpoints directory doesn't exist:
!mkdir checkpoints

In [ ]:
epochs = 10
batch_size = 1000
window_size = 10

with train_graph.as_default():
    saver = tf.train.Saver()

with tf.Session(graph=train_graph) as sess:
    iteration = 1
    loss = 0
    sess.run(tf.global_variables_initializer())

    for e in range(1, epochs+1):
        batches = get_batches(train_words, batch_size, window_size)
        start = time.time()
        for x, y in batches:
            
            feed = {inputs: x,
                    labels: np.array(y)[:, None]}
            train_loss, _ = sess.run([cost, optimizer], feed_dict=feed)
            
            loss += train_loss
            
            if iteration % 100 == 0: 
                end = time.time()
                print("Epoch {}/{}".format(e, epochs),
                      "Iteration: {}".format(iteration),
                      "Avg. Training loss: {:.4f}".format(loss/100),
                      "{:.4f} sec/batch".format((end-start)/100))
                loss = 0
                start = time.time()
            
            if iteration % 1000 == 0:
                # note that this is expensive (~20% slowdown if computed every 500 steps)
                sim = similarity.eval()
                for i in range(valid_size):
                    valid_word = int_to_vocab[valid_examples[i]]
                    top_k = 8 # number of nearest neighbors
                    nearest = (-sim[i, :]).argsort()[1:top_k+1]
                    log = 'Nearest to %s:' % valid_word
                    for k in range(top_k):
                        close_word = int_to_vocab[nearest[k]]
                        log = '%s %s,' % (log, close_word)
                    print(log)
            
            iteration += 1
    save_path = saver.save(sess, "checkpoints/text8.ckpt")
    embed_mat = sess.run(normalized_embedding)


Epoch 1/10 Iteration: 100 Avg. Training loss: 5.6200 0.2884 sec/batch
Epoch 1/10 Iteration: 200 Avg. Training loss: 5.6361 0.2843 sec/batch
Epoch 1/10 Iteration: 300 Avg. Training loss: 5.5265 0.2911 sec/batch
Epoch 1/10 Iteration: 400 Avg. Training loss: 5.6113 0.2928 sec/batch
Epoch 1/10 Iteration: 500 Avg. Training loss: 5.5040 0.2966 sec/batch
Epoch 1/10 Iteration: 600 Avg. Training loss: 5.5520 0.3068 sec/batch
Epoch 1/10 Iteration: 700 Avg. Training loss: 5.5487 0.2954 sec/batch
Epoch 1/10 Iteration: 800 Avg. Training loss: 5.5446 0.2821 sec/batch
Epoch 1/10 Iteration: 900 Avg. Training loss: 5.4716 0.2945 sec/batch
Epoch 1/10 Iteration: 1000 Avg. Training loss: 5.4242 0.2829 sec/batch
Nearest to not: branca, zx, ddd, deutschlands, politican, zanjeer, moire, reputedly,
Nearest to people: established, cowl, untethered, tupac, ccc, shooter, fabrication, algal,
Nearest to between: modicum, anarchy, cyrillic, subtitles, earn, poul, dwindle, imps,
Nearest to new: nafta, repaired, stepmania, highest, ericaceae, ideogram, illuminate, seriousness,
Nearest to many: afghana, deprecated, marple, generalised, seniors, pyroclastic, bad, lina,
Nearest to the: schulz, hiroshima, whisper, mayhem, consistent, wikibook, ousterhout, plum,
Nearest to up: naboth, unbaptized, domineering, dominator, poset, bands, nonlinear, diem,
Nearest to which: awakening, sunos, averaging, bsa, mauro, pneumococcus, bourdon, mandelstam,
Nearest to applications: upto, inferiority, regulus, logbook, ljubljana, subjective, ended, arabidopsis,
Nearest to professional: bashar, improper, coals, xd, lefebvre, ifbb, deflation, rnsson,
Nearest to older: wounds, opioid, performer, prohibits, occultism, overcame, sutta, classicism,
Nearest to shown: kilns, macrolides, bred, mondo, glamorgan, intent, blockers, chatham,
Nearest to rise: hilbert, flower, trader, homoerotic, larsson, triphosphate, kdf, savonarola,
Nearest to derived: gmc, nonverbal, disassembly, thornley, golding, transmembrane, honouring, marshalling,
Nearest to cost: graziani, dnb, pseudalopex, polytheism, caledonian, synergistic, bally, everywhere,
Nearest to existence: longifolia, jbls, bureaucrats, hingle, khazars, lustrum, brenda, hiss,
Epoch 1/10 Iteration: 1100 Avg. Training loss: 5.4932 0.2854 sec/batch
Epoch 1/10 Iteration: 1200 Avg. Training loss: 5.4019 0.2861 sec/batch
Epoch 1/10 Iteration: 1300 Avg. Training loss: 5.3680 0.2819 sec/batch
Epoch 1/10 Iteration: 1400 Avg. Training loss: 5.2838 0.2869 sec/batch
Epoch 1/10 Iteration: 1500 Avg. Training loss: 5.2014 0.2869 sec/batch
Epoch 1/10 Iteration: 1600 Avg. Training loss: 5.1819 0.2734 sec/batch
Epoch 1/10 Iteration: 1700 Avg. Training loss: 5.1049 0.3186 sec/batch
Epoch 1/10 Iteration: 1800 Avg. Training loss: 5.0721 0.2997 sec/batch
Epoch 1/10 Iteration: 1900 Avg. Training loss: 4.9995 0.3106 sec/batch
Epoch 1/10 Iteration: 2000 Avg. Training loss: 4.9931 0.2874 sec/batch
Nearest to not: ddd, zx, branca, maintain, hugo, multi, factor, continuation,
Nearest to people: established, shooter, premiered, untethered, less, tupac, cowl, carries,
Nearest to between: earn, measured, imps, hampered, modicum, positive, near, cyrillic,
Nearest to new: highest, repaired, total, seriousness, census, nafta, denominations, emanation,
Nearest to many: deprecated, bad, generalised, in, afghana, bringing, lina, seekers,
Nearest to the: consistent, agree, alter, hiroshima, counted, name, mayhem, anne,
Nearest to up: manufacturing, bands, main, activities, naboth, echoes, play, com,
Nearest to which: awakening, averaging, them, argues, bauhin, telephone, rings, sunos,
Nearest to applications: upto, ended, subjective, inferiority, regulus, jed, arabidopsis, install,
Nearest to professional: improper, coals, bashar, deflation, inflow, currency, constant, ifbb,
Nearest to older: wounds, overcame, occultism, performer, opioid, husayn, prohibits, manage,
Nearest to shown: bred, kilns, macrolides, c, deemed, intent, subclass, blockers,
Nearest to rise: flower, hilbert, trader, larsson, fine, kdf, differentiated, triphosphate,
Nearest to derived: linear, gmc, golding, maximus, loss, apply, honouring, statues,
Nearest to cost: graziani, everywhere, immediate, dnb, polytheism, throughout, ethnic, caledonian,
Nearest to existence: longifolia, jbls, lustrum, connectivity, children, bureaucrats, brenda, called,
Epoch 1/10 Iteration: 2100 Avg. Training loss: 4.9451 0.2996 sec/batch
Epoch 1/10 Iteration: 2200 Avg. Training loss: 4.9114 0.3562 sec/batch
Epoch 1/10 Iteration: 2300 Avg. Training loss: 4.8880 0.3469 sec/batch
Epoch 1/10 Iteration: 2400 Avg. Training loss: 4.8728 0.3844 sec/batch
Epoch 1/10 Iteration: 2500 Avg. Training loss: 4.7958 0.3750 sec/batch
Epoch 1/10 Iteration: 2600 Avg. Training loss: 4.8410 0.4059 sec/batch
Epoch 1/10 Iteration: 2700 Avg. Training loss: 4.8232 0.3656 sec/batch
Epoch 1/10 Iteration: 2800 Avg. Training loss: 4.7796 0.3710 sec/batch
Epoch 1/10 Iteration: 2900 Avg. Training loss: 4.7586 0.3772 sec/batch
Epoch 1/10 Iteration: 3000 Avg. Training loss: 4.7557 0.3733 sec/batch
Nearest to not: ddd, zx, reputedly, branca, indirect, deutschlands, relative, pottery,
Nearest to people: established, shooter, untethered, tupac, cowl, fabrication, premiered, conceivable,
Nearest to between: earn, modicum, hampered, measured, imps, poisonous, cyrillic, subtitles,
Nearest to new: highest, repaired, census, nafta, shall, total, denominations, victoria,
Nearest to many: deprecated, generalised, bringing, bad, seniors, afghana, urinary, seekers,
Nearest to the: anne, whisper, alter, antiquity, hiroshima, levy, kodak, geographical,
Nearest to up: manufacturing, bands, activities, main, echoes, diem, naboth, insurrection,
Nearest to which: awakening, averaging, electoral, fourteen, argues, patient, nineteen, prospective,
Nearest to applications: upto, subjective, inferiority, ended, regulus, unix, install, arabidopsis,
Nearest to professional: improper, bashar, coals, deflation, inflow, constant, xd, currency,
Nearest to older: wounds, performer, overcame, prohibits, occultism, crews, buddhist, amend,
Nearest to shown: bred, kilns, macrolides, intent, glamorgan, subclass, c, chatham,
Nearest to rise: hilbert, flower, trader, larsson, differentiated, fine, distancing, kdf,
Nearest to derived: linear, gmc, opposition, statues, golding, honouring, loss, epic,
Nearest to cost: graziani, polytheism, everywhere, throughout, concerts, caledonian, dnb, ethnic,
Nearest to existence: bureaucrats, longifolia, jbls, emptied, lustrum, brenda, foes, microseconds,
Epoch 1/10 Iteration: 3100 Avg. Training loss: 4.7895 0.3821 sec/batch
Epoch 1/10 Iteration: 3200 Avg. Training loss: 4.7563 0.3717 sec/batch
Epoch 1/10 Iteration: 3300 Avg. Training loss: 4.7297 0.4090 sec/batch
Epoch 1/10 Iteration: 3400 Avg. Training loss: 4.7145 0.3724 sec/batch
Epoch 1/10 Iteration: 3500 Avg. Training loss: 4.7406 0.4341 sec/batch
Epoch 1/10 Iteration: 3600 Avg. Training loss: 4.7175 0.3832 sec/batch
Epoch 1/10 Iteration: 3700 Avg. Training loss: 4.7100 0.3700 sec/batch
Epoch 1/10 Iteration: 3800 Avg. Training loss: 4.7441 0.3664 sec/batch
Epoch 1/10 Iteration: 3900 Avg. Training loss: 4.7016 0.3640 sec/batch
Epoch 1/10 Iteration: 4000 Avg. Training loss: 4.6782 0.3635 sec/batch
Nearest to not: ddd, zx, reputedly, branca, deutschlands, indirect, lop, guilty,
Nearest to people: established, shooter, tupac, cowl, untethered, premiered, conceivable, genome,
Nearest to between: earn, modicum, hampered, cyrillic, subtitles, imps, anarchy, poisonous,
Nearest to new: nafta, repaired, highest, census, joysticks, shall, total, qc,
Nearest to many: deprecated, generalised, bringing, seniors, afghana, seekers, shareholder, urinary,
Nearest to the: anne, goalie, wikibook, merman, whisper, alter, matured, protectorate,
Nearest to up: bands, main, manufacturing, unbaptized, diem, activities, landscapes, prisoner,
Nearest to which: averaging, awakening, electoral, fourteen, quantitative, argues, them, sunos,
Nearest to applications: upto, subjective, inferiority, regulus, unix, install, arabidopsis, correctly,
Nearest to professional: unruly, bashar, improper, xd, coals, inflow, lefebvre, deflation,
Nearest to older: wounds, husayn, hawker, prohibits, performer, classicism, overcame, crews,
Nearest to shown: kilns, bred, macrolides, intent, chatham, glamorgan, encryption, subclass,
Nearest to rise: hilbert, flower, trader, larsson, distancing, homoerotic, kdf, differentiated,
Nearest to derived: gmc, linear, nonverbal, statues, apply, loss, golding, honouring,
Nearest to cost: polytheism, graziani, throughout, everywhere, synergistic, concerts, thurgood, joey,
Nearest to existence: longifolia, bureaucrats, meagre, khazars, emptied, microseconds, lustrum, jbls,
Epoch 1/10 Iteration: 4100 Avg. Training loss: 4.6600 0.3543 sec/batch
Epoch 1/10 Iteration: 4200 Avg. Training loss: 4.6681 0.3408 sec/batch
Epoch 1/10 Iteration: 4300 Avg. Training loss: 4.6172 0.3634 sec/batch
Epoch 1/10 Iteration: 4400 Avg. Training loss: 4.6153 0.3275 sec/batch
Epoch 1/10 Iteration: 4500 Avg. Training loss: 4.6279 0.3182 sec/batch
Epoch 1/10 Iteration: 4600 Avg. Training loss: 4.6606 0.3159 sec/batch
Epoch 2/10 Iteration: 4700 Avg. Training loss: 4.5800 0.2290 sec/batch
Epoch 2/10 Iteration: 4800 Avg. Training loss: 4.5731 0.3134 sec/batch
Epoch 2/10 Iteration: 4900 Avg. Training loss: 4.5170 0.3146 sec/batch
Epoch 2/10 Iteration: 5000 Avg. Training loss: 4.4840 0.3123 sec/batch
Nearest to not: reputedly, zx, ddd, deutschlands, indirect, branca, lop, zanjeer,
Nearest to people: established, shooter, cowl, tupac, untethered, buildup, algal, singapore,
Nearest to between: modicum, earn, cyrillic, subtitles, petroleum, measured, poisonous, mohammed,
Nearest to new: nafta, repaired, illuminate, census, joysticks, qc, shall, ericaceae,
Nearest to many: deprecated, generalised, afghana, seniors, bringing, in, marple, fillmore,
Nearest to the: goalie, antiquity, protectorate, alter, wikibook, butts, elders, whisper,
Nearest to up: bands, unbaptized, landscapes, manufacturing, activities, highlander, diem, main,
Nearest to which: awakening, averaging, sunos, quantitative, argues, them, fourteen, offensive,
Nearest to applications: upto, subjective, unix, inferiority, install, tools, regulus, correctly,
Nearest to professional: unruly, bashar, xd, blanco, sucre, lefebvre, improper, ifbb,
Nearest to older: wounds, tupi, husayn, hawker, performer, prohibits, classicism, crews,
Nearest to shown: kilns, bred, intent, macrolides, chatham, mondo, encryption, glamorgan,
Nearest to rise: flower, hilbert, trader, larsson, thinly, distancing, homoerotic, triphosphate,
Nearest to derived: nonverbal, linear, gmc, apply, pital, elucidated, sarcastic, enriched,
Nearest to cost: polytheism, graziani, throughout, everywhere, synergistic, hapsburg, bicycles, horizontally,
Nearest to existence: bureaucrats, microseconds, meagre, emptied, khazars, lustrum, longifolia, solemn,
Epoch 2/10 Iteration: 5100 Avg. Training loss: 4.4985 0.3206 sec/batch
Epoch 2/10 Iteration: 5200 Avg. Training loss: 4.4799 0.3224 sec/batch
Epoch 2/10 Iteration: 5300 Avg. Training loss: 4.4506 0.3110 sec/batch
Epoch 2/10 Iteration: 5400 Avg. Training loss: 4.5356 0.3528 sec/batch
Epoch 2/10 Iteration: 5500 Avg. Training loss: 4.5175 0.3382 sec/batch
Epoch 2/10 Iteration: 5600 Avg. Training loss: 4.4731 0.3234 sec/batch
Epoch 2/10 Iteration: 5700 Avg. Training loss: 4.4455 0.3165 sec/batch
Epoch 2/10 Iteration: 5800 Avg. Training loss: 4.3816 0.3554 sec/batch
Epoch 2/10 Iteration: 5900 Avg. Training loss: 4.4487 0.2999 sec/batch
Epoch 2/10 Iteration: 6000 Avg. Training loss: 4.4115 0.2998 sec/batch
Nearest to not: deutschlands, reputedly, ddd, indirect, lop, zx, branca, zanjeer,
Nearest to people: established, cowl, shooter, melvyn, untethered, singapore, conceivable, tupac,
Nearest to between: earn, petroleum, measured, modicum, cyrillic, subtitles, alkaline, poisonous,
Nearest to new: nafta, repaired, ericaceae, stepmania, joysticks, qc, illuminate, gustave,
Nearest to many: generalised, deprecated, afghana, seniors, bringing, shareholder, marple, seekers,
Nearest to the: goalie, butts, wikibook, dwarfs, alter, whisper, antiquity, mayhem,
Nearest to up: bands, main, unbaptized, gaited, endeavoring, activities, highlander, manufacturing,
Nearest to which: awakening, bsa, sunos, averaging, argues, quantitative, fourteen, them,
Nearest to applications: unix, upto, subjective, install, inferiority, tools, computers, developers,
Nearest to professional: unruly, blanco, sucre, lefebvre, bashar, pushkin, downsized, xd,
Nearest to older: husayn, hawker, tupi, wounds, classicism, performer, prohibits, householder,
Nearest to shown: kilns, bred, encryption, chatham, macrolides, intent, brouwer, subclass,
Nearest to rise: flower, hilbert, larsson, trader, thinly, distancing, facilitating, kdf,
Nearest to derived: linear, nonverbal, gmc, apply, pital, sarcastic, elucidated, judeo,
Nearest to cost: polytheism, synergistic, everywhere, graziani, throughout, bicycles, horizontally, hapsburg,
Nearest to existence: meagre, microseconds, emptied, bureaucrats, seem, khazars, museu, lustrum,
Epoch 2/10 Iteration: 6100 Avg. Training loss: 4.4282 0.3543 sec/batch
Epoch 2/10 Iteration: 6200 Avg. Training loss: 4.4086 0.3379 sec/batch
Epoch 2/10 Iteration: 6300 Avg. Training loss: 4.4438 0.3533 sec/batch
Epoch 2/10 Iteration: 6400 Avg. Training loss: 4.3766 0.3506 sec/batch
Epoch 2/10 Iteration: 6500 Avg. Training loss: 4.4241 0.3255 sec/batch
Epoch 2/10 Iteration: 6600 Avg. Training loss: 4.4510 0.3750 sec/batch
Epoch 2/10 Iteration: 6700 Avg. Training loss: 4.3678 0.3137 sec/batch
Epoch 2/10 Iteration: 6800 Avg. Training loss: 4.3933 0.3287 sec/batch
Epoch 2/10 Iteration: 6900 Avg. Training loss: 4.4178 0.4148 sec/batch
Epoch 2/10 Iteration: 7000 Avg. Training loss: 4.3710 0.3619 sec/batch
Nearest to not: deutschlands, indirect, leprosy, them, lop, zanjeer, zx, reputedly,
Nearest to people: established, shooter, cowl, superclass, algal, singapore, less, melvyn,
Nearest to between: earn, measured, cyrillic, modicum, subtitles, imps, petroleum, dialog,
Nearest to new: nafta, illuminate, stepmania, joysticks, repaired, ericaceae, gustave, qc,
Nearest to many: generalised, afghana, seniors, deprecated, krew, urinary, shareholder, marple,
Nearest to the: goalie, butts, alter, consistent, wikibook, whisper, is, discarding,
Nearest to up: gaited, bands, unbaptized, play, activities, endeavoring, highlander, disadvantage,
Nearest to which: awakening, sunos, averaging, quantitative, bsa, them, noyce, regard,
Nearest to applications: unix, upto, install, subjective, tools, developers, computers, inferiority,
Nearest to professional: unruly, lefebvre, sucre, blanco, pushkin, downsized, played, mentor,
Nearest to older: householder, tupi, husayn, males, hawker, classicism, wounds, family,
Nearest to shown: kilns, bred, encryption, macrolides, chatham, subclass, materiality, esteem,
Nearest to rise: flower, hilbert, larsson, trader, thinly, distancing, facilitating, myocardial,
Nearest to derived: linear, nonverbal, gmc, pital, judeo, enriched, calw, decorative,
Nearest to cost: synergistic, polytheism, everywhere, graziani, lack, horizontally, bicycles, throughout,
Nearest to existence: meagre, microseconds, emptied, quantifiers, longifolia, bureaucrats, adjectives, seem,
Epoch 2/10 Iteration: 7100 Avg. Training loss: 4.3811 0.3685 sec/batch
Epoch 2/10 Iteration: 7200 Avg. Training loss: 4.3949 0.4101 sec/batch
Epoch 2/10 Iteration: 7300 Avg. Training loss: 4.3721 0.4149 sec/batch
Epoch 2/10 Iteration: 7400 Avg. Training loss: 4.3699 0.4191 sec/batch
Epoch 2/10 Iteration: 7500 Avg. Training loss: 4.4082 0.4590 sec/batch
Epoch 2/10 Iteration: 7600 Avg. Training loss: 4.3451 0.3658 sec/batch
Epoch 2/10 Iteration: 7700 Avg. Training loss: 4.3906 0.3301 sec/batch
Epoch 2/10 Iteration: 7800 Avg. Training loss: 4.3559 0.3507 sec/batch
Epoch 2/10 Iteration: 7900 Avg. Training loss: 4.3191 0.4104 sec/batch
Epoch 2/10 Iteration: 8000 Avg. Training loss: 4.3139 0.3505 sec/batch
Nearest to not: deutschlands, lop, or, indirect, leprosy, them, zanjeer, guilty,
Nearest to people: established, shooter, cowl, algal, singapore, superclass, less, melvyn,
Nearest to between: modicum, cyrillic, earn, measured, subtitles, mohammed, dwindle, ruble,
Nearest to new: nafta, ericaceae, repaired, proscriptions, joysticks, illuminate, stepmania, mjf,
Nearest to many: afghana, krew, shareholder, seniors, generalised, urinary, deprecated, bringing,
Nearest to the: goalie, wikibook, geographical, is, consistent, antiquity, in, discarding,
Nearest to up: gaited, bands, highlander, unbaptized, activities, endeavoring, play, dominator,
Nearest to which: bsa, sunos, noyce, counties, averaging, regard, offensive, them,
Nearest to applications: unix, upto, install, developers, tools, subjective, computers, correctly,
Nearest to professional: unruly, lefebvre, downsized, sucre, pushkin, blanco, xd, mentor,
Nearest to older: householder, tupi, males, husayn, classicism, family, wounds, hawker,
Nearest to shown: kilns, bred, encryption, chatham, materiality, macrolides, undecidability, subclass,
Nearest to rise: flower, larsson, thinly, trader, distancing, hilbert, facilitating, diplomatically,
Nearest to derived: nonverbal, linear, judeo, tribes, calw, pital, gmc, shades,
Nearest to cost: synergistic, lack, polytheism, nix, bicycles, thurgood, cass, throughout,
Nearest to existence: adjectives, microseconds, emptied, meagre, quantifiers, longifolia, seem, bureaucrats,
Epoch 2/10 Iteration: 8100 Avg. Training loss: 4.3634 0.3667 sec/batch
Epoch 2/10 Iteration: 8200 Avg. Training loss: 4.2803 0.3634 sec/batch
Epoch 2/10 Iteration: 8300 Avg. Training loss: 4.3918 0.3846 sec/batch
Epoch 2/10 Iteration: 8400 Avg. Training loss: 4.3740 0.4456 sec/batch
Epoch 2/10 Iteration: 8500 Avg. Training loss: 4.3750 0.3377 sec/batch
Epoch 2/10 Iteration: 8600 Avg. Training loss: 4.2843 0.4026 sec/batch
Epoch 2/10 Iteration: 8700 Avg. Training loss: 4.3196 0.4092 sec/batch
Epoch 2/10 Iteration: 8800 Avg. Training loss: 4.3457 0.3796 sec/batch
Epoch 2/10 Iteration: 8900 Avg. Training loss: 4.2405 0.3479 sec/batch
Epoch 2/10 Iteration: 9000 Avg. Training loss: 4.2924 0.4073 sec/batch
Nearest to not: deutschlands, lop, leprosy, zanjeer, indirect, asked, or, guilty,
Nearest to people: established, shooter, cowl, tupac, less, algal, singapore, rapp,
Nearest to between: cyrillic, measured, modicum, earn, mohammed, dwindle, lysimachus, subtitles,
Nearest to new: nafta, proscriptions, repaired, transducer, illuminate, joysticks, pappas, stepmania,
Nearest to many: afghana, generalised, krew, seniors, shareholder, urinary, marple, deprecated,
Nearest to the: goalie, is, subtrees, wikibook, geographical, butts, discarding, protectorate,
Nearest to up: gaited, highlander, bands, kayaks, emeryville, mislabeled, diem, endeavoring,
Nearest to which: bsa, regard, noyce, sunos, offensive, awakening, counties, argues,
Nearest to applications: unix, install, upto, developers, tools, subjective, desktop, correctly,
Nearest to professional: blanco, sucre, unruly, lefebvre, played, mentor, downsized, popularity,
Nearest to older: householder, tupi, males, husayn, family, coslet, hawker, classicism,
Nearest to shown: kilns, bred, undecidability, encryption, brouwer, chatham, macrolides, materiality,
Nearest to rise: flower, thinly, trader, larsson, distancing, diplomatically, hilbert, facilitating,
Nearest to derived: judeo, linear, nonverbal, calw, gmc, pital, tribes, profusion,
Nearest to cost: lack, nix, synergistic, bicycles, cass, upgrades, horizontally, lotus,
Nearest to existence: adjectives, emptied, longifolia, meagre, microseconds, quantifiers, bureaucrats, auras,
Epoch 2/10 Iteration: 9100 Avg. Training loss: 4.3062 0.3389 sec/batch
Epoch 2/10 Iteration: 9200 Avg. Training loss: 4.3032 0.2870 sec/batch
Epoch 3/10 Iteration: 9300 Avg. Training loss: 4.3353 0.1296 sec/batch
Epoch 3/10 Iteration: 9400 Avg. Training loss: 4.2349 0.2896 sec/batch
Epoch 3/10 Iteration: 9500 Avg. Training loss: 4.2225 0.3005 sec/batch
Epoch 3/10 Iteration: 9600 Avg. Training loss: 4.1992 0.2929 sec/batch
Epoch 3/10 Iteration: 9700 Avg. Training loss: 4.2196 0.3029 sec/batch
Epoch 3/10 Iteration: 9800 Avg. Training loss: 4.2102 0.2928 sec/batch
Epoch 3/10 Iteration: 9900 Avg. Training loss: 4.1979 0.3017 sec/batch
Epoch 3/10 Iteration: 10000 Avg. Training loss: 4.1700 0.3032 sec/batch
Nearest to not: deutschlands, asked, zanjeer, reputedly, lop, them, upon, rias,
Nearest to people: shooter, established, less, cowl, algal, singapore, melvyn, tupac,
Nearest to between: cyrillic, earn, measured, modicum, dwindle, lysimachus, from, mohammed,
Nearest to new: proscriptions, nafta, transducer, joysticks, stepmania, ericaceae, pappas, illuminate,
Nearest to many: afghana, krew, generalised, shareholder, seniors, sub, notable, culture,
Nearest to the: goalie, is, in, protectorate, wikibook, butts, subtrees, rebounding,
Nearest to up: gaited, kayaks, highlander, unbaptized, play, bands, mislabeled, rusher,
Nearest to which: bsa, sunos, noyce, awakening, averaging, alternatively, offensive, counties,
Nearest to applications: unix, install, upto, developers, tools, desktop, networking, subjective,
Nearest to professional: unruly, blanco, lefebvre, trophies, played, popularity, sucre, downsized,
Nearest to older: householder, males, tupi, family, husayn, classicism, coslet, age,
Nearest to shown: kilns, macrolides, bred, chatham, brouwer, conditioner, blockers, materiality,
Nearest to rise: flower, thinly, trader, larsson, diplomatically, facilitating, nigsberg, distancing,
Nearest to derived: nonverbal, judeo, linear, calw, gmc, pital, tribes, profusion,
Nearest to cost: upgrades, lack, bicycles, nix, horizontally, lotus, cass, thurgood,
Nearest to existence: adjectives, emptied, bureaucrats, microseconds, meagre, auras, quantifiers, longifolia,
Epoch 3/10 Iteration: 10100 Avg. Training loss: 4.2290 0.3214 sec/batch
Epoch 3/10 Iteration: 10200 Avg. Training loss: 4.2014 0.2953 sec/batch
Epoch 3/10 Iteration: 10300 Avg. Training loss: 4.2157 0.2830 sec/batch
Epoch 3/10 Iteration: 10400 Avg. Training loss: 4.1430 0.2865 sec/batch
Epoch 3/10 Iteration: 10500 Avg. Training loss: 4.1779 0.2874 sec/batch
Epoch 3/10 Iteration: 10600 Avg. Training loss: 4.1662 0.3126 sec/batch
Epoch 3/10 Iteration: 10700 Avg. Training loss: 4.1764 0.2730 sec/batch
Epoch 3/10 Iteration: 10800 Avg. Training loss: 4.1808 0.2739 sec/batch
Epoch 3/10 Iteration: 10900 Avg. Training loss: 4.2122 0.2823 sec/batch
Epoch 3/10 Iteration: 11000 Avg. Training loss: 4.1818 0.2927 sec/batch
Nearest to not: deutschlands, asked, them, leprosy, want, rejection, or, zanjeer,
Nearest to people: shooter, established, singapore, less, algal, tupac, melvyn, cowl,
Nearest to between: cyrillic, measured, earn, dwindle, modicum, indebtedness, boudin, from,
Nearest to new: proscriptions, joysticks, pappas, stepmania, nafta, ericaceae, transducer, ricks,
Nearest to many: generalised, krew, shareholder, sub, afghana, seniors, bringing, urinary,
Nearest to the: is, goalie, in, consistent, reactants, subtrees, discarding, mayhem,
Nearest to up: gaited, kayaks, highlander, play, rusher, aired, dominator, emeryville,
Nearest to which: bsa, sunos, noyce, alternatively, awakening, nanotubes, regard, large,
Nearest to applications: unix, upto, install, developers, tools, subjective, arabidopsis, shortcut,
Nearest to professional: popularity, played, blanco, unruly, lefebvre, downsized, xd, pushkin,
Nearest to older: householder, males, family, tupi, husayn, age, classicism, coslet,
Nearest to shown: kilns, macrolides, brouwer, chatham, conditioner, blockers, simulant, bred,
Nearest to rise: flower, thinly, hilbert, trader, myocardial, larsson, distancing, diplomatically,
Nearest to derived: nonverbal, judeo, linear, gmc, tribes, calw, shades, reside,
Nearest to cost: upgrades, lack, nix, thurgood, conceptualize, bicycles, broadband, lotus,
Nearest to existence: microseconds, emptied, adjectives, quantifiers, auras, meagre, bureaucrats, logico,
Epoch 3/10 Iteration: 11100 Avg. Training loss: 4.1660 0.3079 sec/batch
Epoch 3/10 Iteration: 11200 Avg. Training loss: 4.2116 0.3660 sec/batch
Epoch 3/10 Iteration: 11300 Avg. Training loss: 4.1844 0.3306 sec/batch
Epoch 3/10 Iteration: 11400 Avg. Training loss: 4.1447 0.3237 sec/batch
Epoch 3/10 Iteration: 11500 Avg. Training loss: 4.1642 0.2933 sec/batch
Epoch 3/10 Iteration: 11600 Avg. Training loss: 4.2102 0.2959 sec/batch
Epoch 3/10 Iteration: 11700 Avg. Training loss: 4.2017 0.3088 sec/batch
Epoch 3/10 Iteration: 11800 Avg. Training loss: 4.1536 0.2949 sec/batch
Epoch 3/10 Iteration: 11900 Avg. Training loss: 4.1411 0.3017 sec/batch
Epoch 3/10 Iteration: 12000 Avg. Training loss: 4.1602 0.3028 sec/batch
Nearest to not: deutschlands, asked, them, zanjeer, maggots, leprosy, or, guilty,
Nearest to people: shooter, singapore, established, tupac, algal, less, cowl, rapp,
Nearest to between: cyrillic, measured, modicum, earn, boudin, dwindle, subtitles, from,
Nearest to new: stepmania, pappas, joysticks, proscriptions, ricks, elgar, illuminate, qc,
Nearest to many: krew, shareholder, sub, generalised, afghana, notable, seniors, broadway,
Nearest to the: is, in, goalie, consistent, subtrees, levy, a, reactants,
Nearest to up: kayaks, gaited, highlander, angry, dominator, anubis, rusher, emeryville,
Nearest to which: bsa, sunos, noyce, alternatively, regard, awakening, fourteen, hestia,
Nearest to applications: upto, unix, install, developers, tools, subjective, embedded, arabidopsis,
Nearest to professional: popularity, played, xd, downsized, trophies, unruly, mentor, directions,
Nearest to older: householder, males, family, tupi, age, husayn, families, females,
Nearest to shown: kilns, macrolides, brouwer, simulant, conditioner, ninhursag, blockers, materiality,
Nearest to rise: flower, hilbert, distancing, thinly, larsson, trader, myocardial, facilitating,
Nearest to derived: judeo, nonverbal, tribes, linear, gmc, calw, reside, sanskrit,
Nearest to cost: upgrades, lack, thurgood, nix, cass, conceptualize, lotus, earthen,
Nearest to existence: microseconds, quantifiers, adjectives, emptied, view, meagre, auras, bureaucrats,
Epoch 3/10 Iteration: 12100 Avg. Training loss: 4.2036 0.3107 sec/batch
Epoch 3/10 Iteration: 12200 Avg. Training loss: 4.1705 0.3055 sec/batch
Epoch 3/10 Iteration: 12300 Avg. Training loss: 4.1698 0.3038 sec/batch
Epoch 3/10 Iteration: 12400 Avg. Training loss: 4.1781 0.2918 sec/batch
Epoch 3/10 Iteration: 12500 Avg. Training loss: 4.1415 0.2798 sec/batch
Epoch 3/10 Iteration: 12600 Avg. Training loss: 4.1512 0.3236 sec/batch
Epoch 3/10 Iteration: 12700 Avg. Training loss: 4.1589 0.6297 sec/batch
Epoch 3/10 Iteration: 12800 Avg. Training loss: 4.1200 0.5955 sec/batch
Epoch 3/10 Iteration: 12900 Avg. Training loss: 4.1795 0.4541 sec/batch
Epoch 3/10 Iteration: 13000 Avg. Training loss: 4.2217 0.4879 sec/batch
Nearest to not: deutschlands, asked, guilty, that, want, or, maggots, resisted,
Nearest to people: shooter, rapp, singapore, established, tupac, cowl, algal, tanzania,
Nearest to between: cyrillic, modicum, measured, earn, shimon, negus, boudin, christianized,
Nearest to new: proscriptions, elgar, pappas, nafta, peregrine, stepmania, ricks, early,
Nearest to many: afghana, sub, krew, generalised, notable, shareholder, seniors, seekers,
Nearest to the: in, is, coronations, goalie, a, protectorate, levy, wayland,
Nearest to up: kayaks, anubis, gaited, unbaptized, domineering, highlander, angry, stationed,
Nearest to which: bsa, regard, sunos, noyce, dealings, awakening, parsis, counties,
Nearest to applications: unix, upto, install, developers, tools, desktop, itanium, cryptology,
Nearest to professional: played, pushkin, unruly, blanco, xd, downsized, popularity, trophies,
Nearest to older: householder, males, family, tupi, age, females, husayn, families,
Nearest to shown: kilns, macrolides, brouwer, ninhursag, undecidability, conditioner, chatham, deallocation,
Nearest to rise: flower, hilbert, distancing, larsson, diplomatically, trader, nigsberg, thinly,
Nearest to derived: judeo, nonverbal, tribes, sanskrit, reside, slavonic, calw, gmc,
Nearest to cost: upgrades, lack, nix, thurgood, cass, nhra, guerillas, conceptualize,
Nearest to existence: microseconds, quantifiers, view, adjectives, emptied, auras, meagre, bureaucrats,
Epoch 3/10 Iteration: 13100 Avg. Training loss: 4.2418 0.4912 sec/batch
Epoch 3/10 Iteration: 13200 Avg. Training loss: 4.1435 0.4746 sec/batch
Epoch 3/10 Iteration: 13300 Avg. Training loss: 4.1271 0.5476 sec/batch
Epoch 3/10 Iteration: 13400 Avg. Training loss: 4.1498 0.3148 sec/batch
Epoch 3/10 Iteration: 13500 Avg. Training loss: 4.0596 0.4680 sec/batch
Epoch 3/10 Iteration: 13600 Avg. Training loss: 4.1336 0.5572 sec/batch
Epoch 3/10 Iteration: 13700 Avg. Training loss: 4.1604 0.6424 sec/batch
Epoch 3/10 Iteration: 13800 Avg. Training loss: 4.1415 0.6259 sec/batch
Epoch 4/10 Iteration: 13900 Avg. Training loss: 4.1738 0.1199 sec/batch
Epoch 4/10 Iteration: 14000 Avg. Training loss: 4.0890 0.5968 sec/batch
Nearest to not: deutschlands, asked, zanjeer, that, guilty, want, ddd, maggots,
Nearest to people: shooter, singapore, less, algal, established, tupac, tanzania, cowl,
Nearest to between: cyrillic, measured, boudin, earn, modicum, dwindle, its, shimon,
Nearest to new: elgar, ricks, pappas, proscriptions, joysticks, nafta, stepmania, prevailed,
Nearest to many: sub, generalised, afghana, krew, notable, seniors, broadway, shareholder,
Nearest to the: in, is, a, goalie, panels, consistent, each, tempest,
Nearest to up: kayaks, tolerate, gaited, highlander, hostname, emeryville, anubis, rusher,
Nearest to which: bsa, sunos, noyce, nanotubes, alternatively, regard, dealings, awakening,
Nearest to applications: upto, unix, install, tools, developers, ended, cryptology, itanium,
Nearest to professional: popularity, played, pushkin, xd, unruly, trophies, directions, sucre,
Nearest to older: householder, males, tupi, family, age, females, classicism, coslet,
Nearest to shown: macrolides, kilns, brouwer, ninhursag, chatham, blockers, undecidability, transfection,
Nearest to rise: flower, hilbert, distancing, nigsberg, diplomatically, klaip, thinly, larsson,
Nearest to derived: judeo, nonverbal, tribes, reside, sanskrit, calw, slavonic, leet,
Nearest to cost: upgrades, lack, nix, thurgood, will, cass, immediate, improving,
Nearest to existence: microseconds, view, quantifiers, auras, emptied, matthew, meagre, adjectives,
Epoch 4/10 Iteration: 14100 Avg. Training loss: 4.1017 0.5195 sec/batch
Epoch 4/10 Iteration: 14200 Avg. Training loss: 4.0997 0.5917 sec/batch
Epoch 4/10 Iteration: 14300 Avg. Training loss: 4.1100 0.5835 sec/batch
Epoch 4/10 Iteration: 14400 Avg. Training loss: 4.0396 0.6098 sec/batch
Epoch 4/10 Iteration: 14500 Avg. Training loss: 4.0842 0.5590 sec/batch
Epoch 4/10 Iteration: 14600 Avg. Training loss: 4.0380 0.5028 sec/batch
Epoch 4/10 Iteration: 14700 Avg. Training loss: 4.1021 0.5200 sec/batch
Epoch 4/10 Iteration: 14800 Avg. Training loss: 4.1101 0.4995 sec/batch
Epoch 4/10 Iteration: 14900 Avg. Training loss: 4.1228 0.5029 sec/batch
Epoch 4/10 Iteration: 15000 Avg. Training loss: 4.0098 0.5116 sec/batch
Nearest to not: deutschlands, asked, want, skold, ddd, yours, that, zanjeer,
Nearest to people: shooter, singapore, less, prabhupada, algal, tanzania, whites, rapp,
Nearest to between: cyrillic, measured, earn, boudin, dwindle, modicum, shimon, annex,
Nearest to new: elgar, proscriptions, ricks, stepmania, pappas, joysticks, prevailed, early,
Nearest to many: sub, generalised, notable, afghana, shareholder, have, krew, criticisms,
Nearest to the: in, is, a, goalie, each, as, gopala, panels,
Nearest to up: kayaks, gaited, rusher, dominator, highlander, stationed, angry, aired,
Nearest to which: bsa, alternatively, sunos, demilitarization, noyce, nanotubes, large, dealings,
Nearest to applications: upto, unix, install, tools, developers, cryptology, ended, yudhoyono,
Nearest to professional: popularity, played, xd, trophies, unruly, pushkin, directions, rivaling,
Nearest to older: householder, males, family, tupi, age, females, families, husayn,
Nearest to shown: kilns, ninhursag, macrolides, brouwer, chatham, undecidability, materiality, transfection,
Nearest to rise: flower, hilbert, thinly, nigsberg, diplomatically, klaip, distancing, trader,
Nearest to derived: judeo, nonverbal, tribes, reside, leet, shades, gmc, sanskrit,
Nearest to cost: upgrades, lack, thurgood, immediate, nix, will, guerillas, conceptualize,
Nearest to existence: microseconds, quantifiers, view, emptied, auras, adjectives, matthew, sedgewick,
Epoch 4/10 Iteration: 15100 Avg. Training loss: 4.0346 0.5161 sec/batch
Epoch 4/10 Iteration: 15200 Avg. Training loss: 4.0272 0.5271 sec/batch
Epoch 4/10 Iteration: 15300 Avg. Training loss: 4.0399 0.5020 sec/batch
Epoch 4/10 Iteration: 15400 Avg. Training loss: 4.0645 0.4996 sec/batch
Epoch 4/10 Iteration: 15500 Avg. Training loss: 4.1016 0.5080 sec/batch
Epoch 4/10 Iteration: 15600 Avg. Training loss: 4.0655 0.5027 sec/batch
Epoch 4/10 Iteration: 15700 Avg. Training loss: 4.0750 0.5169 sec/batch
Epoch 4/10 Iteration: 15800 Avg. Training loss: 4.1207 0.4800 sec/batch
Epoch 4/10 Iteration: 15900 Avg. Training loss: 4.0485 0.5020 sec/batch
Epoch 4/10 Iteration: 16000 Avg. Training loss: 4.0397 0.5064 sec/batch
Nearest to not: asked, deutschlands, that, want, or, them, furthermore, leprosy,
Nearest to people: shooter, singapore, algal, tupac, superclass, rapp, less, cowl,
Nearest to between: measured, cyrillic, earn, modicum, boudin, leftrightarrow, dwindle, shimon,
Nearest to new: elgar, ricks, pappas, proscriptions, joysticks, stepmania, victoria, jaguars,
Nearest to many: sub, generalised, krew, afghana, notable, have, shareholder, criticisms,
Nearest to the: is, in, a, each, consistent, reactants, as, gopala,
Nearest to up: kayaks, gaited, anubis, hostname, dominator, tolerate, toughened, sissy,
Nearest to which: nanotubes, sunos, bsa, alternatively, phosphorylated, noyce, regard, awakening,
Nearest to applications: upto, unix, install, tools, ended, cryptology, developers, internationalization,
Nearest to professional: popularity, xd, played, trophies, academies, pushkin, rivaling, unruly,
Nearest to older: householder, males, family, tupi, age, females, families, households,
Nearest to shown: ninhursag, kilns, brouwer, transfection, materiality, macrolides, blockers, simulant,
Nearest to rise: hilbert, flower, distancing, myocardial, larsson, nigsberg, thinly, cools,
Nearest to derived: judeo, nonverbal, reside, leet, tribes, shades, linear, sanskrit,
Nearest to cost: upgrades, lack, thurgood, nix, immediate, improving, cass, guerillas,
Nearest to existence: microseconds, view, quantifiers, postulate, adjectives, emptied, auras, sedgewick,
Epoch 4/10 Iteration: 16100 Avg. Training loss: 4.0574 0.5098 sec/batch
Epoch 4/10 Iteration: 16200 Avg. Training loss: 4.0772 0.5038 sec/batch
Epoch 4/10 Iteration: 16300 Avg. Training loss: 4.0855 0.5008 sec/batch
Epoch 4/10 Iteration: 16400 Avg. Training loss: 4.0590 0.4989 sec/batch
Epoch 4/10 Iteration: 16500 Avg. Training loss: 4.0818 0.5180 sec/batch
Epoch 4/10 Iteration: 16600 Avg. Training loss: 4.0568 0.5664 sec/batch
Epoch 4/10 Iteration: 16700 Avg. Training loss: 4.0783 0.5141 sec/batch
Epoch 4/10 Iteration: 16800 Avg. Training loss: 4.0548 0.5062 sec/batch
Epoch 4/10 Iteration: 16900 Avg. Training loss: 4.0929 0.5079 sec/batch
Epoch 4/10 Iteration: 17000 Avg. Training loss: 4.0564 0.5207 sec/batch
Nearest to not: asked, deutschlands, furthermore, that, zanjeer, ddd, skold, want,
Nearest to people: shooter, singapore, algal, is, superclass, prabhupada, cowl, tupac,
Nearest to between: cyrillic, measured, modicum, earn, shimon, boudin, its, christianized,
Nearest to new: elgar, proscriptions, ricks, revolution, prevailed, pappas, early, mjf,
Nearest to many: sub, generalised, have, krew, notable, afghana, seekers, derry,
Nearest to the: in, is, a, each, as, suebi, of, coronations,
Nearest to up: tolerate, kayaks, anubis, hostname, busing, gaited, landfill, disk,
Nearest to which: bsa, parsis, awakening, pagemaker, dealings, electoral, nanotubes, equitable,
Nearest to applications: upto, unix, install, tools, ended, developers, cryptology, yudhoyono,
Nearest to professional: xd, popularity, played, pushkin, unruly, trophies, mentor, rivaling,
Nearest to older: householder, males, family, tupi, age, families, females, households,
Nearest to shown: ninhursag, brouwer, kilns, transfection, macrolides, motif, chatham, materiality,
Nearest to rise: hilbert, flower, distancing, larsson, nigsberg, diplomatically, myocardial, thinly,
Nearest to derived: judeo, tribes, nonverbal, reside, shades, leet, sanskrit, opposition,
Nearest to cost: upgrades, lack, cass, thurgood, nix, guerillas, improving, immediate,
Nearest to existence: microseconds, view, quantifiers, postulate, emptied, adjectives, auras, matthew,
Epoch 4/10 Iteration: 17100 Avg. Training loss: 4.0624 0.5304 sec/batch
Epoch 4/10 Iteration: 17200 Avg. Training loss: 4.0238 0.5063 sec/batch
Epoch 4/10 Iteration: 17300 Avg. Training loss: 4.0323 0.5064 sec/batch
Epoch 4/10 Iteration: 17400 Avg. Training loss: 4.0574 0.5254 sec/batch
Epoch 4/10 Iteration: 17500 Avg. Training loss: 4.0638 0.5137 sec/batch
Epoch 4/10 Iteration: 17600 Avg. Training loss: 4.1167 0.5045 sec/batch
Epoch 4/10 Iteration: 17700 Avg. Training loss: 4.1445 0.5146 sec/batch
Epoch 4/10 Iteration: 17800 Avg. Training loss: 4.0526 0.5075 sec/batch
Epoch 4/10 Iteration: 17900 Avg. Training loss: 4.0451 0.5074 sec/batch
Epoch 4/10 Iteration: 18000 Avg. Training loss: 4.0692 0.5232 sec/batch
Nearest to not: asked, deutschlands, that, furthermore, want, then, or, zanjeer,
Nearest to people: shooter, singapore, rapp, cowl, algal, tupac, tanzania, fula,
Nearest to between: earn, cyrillic, measured, modicum, disiyyah, negus, boudin, varangians,
Nearest to new: elgar, revolution, proscriptions, ricks, peregrine, early, pappas, prevailed,
Nearest to many: sub, generalised, have, seekers, notable, criticisms, afghana, derry,
Nearest to the: in, is, a, each, as, of, levy, this,
Nearest to up: tolerate, hostname, anubis, gaited, emeryville, hospice, dominator, disk,
Nearest to which: bsa, electoral, nanotubes, parsis, pagemaker, handy, regard, failed,
Nearest to applications: upto, unix, install, developers, tools, cryptology, ended, yudhoyono,
Nearest to professional: xd, pushkin, played, unruly, popularity, gein, designating, sports,
Nearest to older: householder, males, family, age, females, families, tupi, households,
Nearest to shown: ninhursag, kilns, a, brouwer, macrolides, transfection, deallocation, quadrants,
Nearest to rise: hilbert, flower, distancing, nigsberg, klaip, diplomatically, larsson, myocardial,
Nearest to derived: judeo, reside, tribes, leet, nonverbal, shades, sanskrit, calw,
Nearest to cost: upgrades, lack, nix, improving, benefit, thurgood, cass, immediate,
Nearest to existence: microseconds, view, quantifiers, sedgewick, logico, postulate, auras, longifolia,
Epoch 4/10 Iteration: 18100 Avg. Training loss: 3.9707 0.5147 sec/batch
Epoch 4/10 Iteration: 18200 Avg. Training loss: 4.0237 0.5091 sec/batch
Epoch 4/10 Iteration: 18300 Avg. Training loss: 4.0340 0.5062 sec/batch
Epoch 4/10 Iteration: 18400 Avg. Training loss: 4.0338 0.5453 sec/batch
Epoch 4/10 Iteration: 18500 Avg. Training loss: 4.0835 0.5188 sec/batch
Epoch 5/10 Iteration: 18600 Avg. Training loss: 4.0359 0.5180 sec/batch
Epoch 5/10 Iteration: 18700 Avg. Training loss: 4.0032 0.5798 sec/batch
Epoch 5/10 Iteration: 18800 Avg. Training loss: 3.9834 0.5408 sec/batch
Epoch 5/10 Iteration: 18900 Avg. Training loss: 4.0274 0.6400 sec/batch
Epoch 5/10 Iteration: 19000 Avg. Training loss: 3.9854 0.6031 sec/batch
Nearest to not: asked, deutschlands, that, zanjeer, furthermore, then, want, circumstantial,
Nearest to people: shooter, singapore, cowl, algal, tanzania, superclass, fula, whites,
Nearest to between: earn, measured, cyrillic, modicum, dwindle, sporadically, disiyyah, shimon,
Nearest to new: revolution, ricks, proscriptions, elgar, early, prevailed, pappas, york,
Nearest to many: sub, generalised, notable, afghana, have, seekers, culture, areas,
Nearest to the: in, a, is, each, of, suebi, as, widest,
Nearest to up: tolerate, hostname, have, toughened, anubis, disk, busing, dominator,
Nearest to which: nanotubes, pagemaker, bsa, handy, phosphorylated, alternatively, carrier, be,
Nearest to applications: upto, unix, install, developers, ended, cryptology, tools, desktop,
Nearest to professional: played, pushkin, sports, xd, gein, academies, popularity, unruly,
Nearest to older: householder, males, family, tupi, age, families, females, households,
Nearest to shown: ninhursag, macrolides, kilns, chatham, brouwer, transfection, quadrants, deallocation,
Nearest to rise: hilbert, flower, klaip, nigsberg, larsson, distancing, diplomatically, thinly,
Nearest to derived: judeo, reside, nonverbal, tribes, leet, cloned, shades, sanskrit,
Nearest to cost: upgrades, lack, immediate, profitable, improving, nix, thurgood, benefit,
Nearest to existence: view, quantifiers, microseconds, postulate, auras, adjectives, logico, emptied,
Epoch 5/10 Iteration: 19100 Avg. Training loss: 3.9929 0.5683 sec/batch
Epoch 5/10 Iteration: 19200 Avg. Training loss: 3.9378 0.4135 sec/batch
Epoch 5/10 Iteration: 19300 Avg. Training loss: 3.9860 0.3370 sec/batch
Epoch 5/10 Iteration: 19400 Avg. Training loss: 4.0043 0.3330 sec/batch
Epoch 5/10 Iteration: 19500 Avg. Training loss: 4.0077 0.3087 sec/batch
Epoch 5/10 Iteration: 19600 Avg. Training loss: 4.0350 0.3039 sec/batch
Epoch 5/10 Iteration: 19700 Avg. Training loss: 3.9139 0.3204 sec/batch
Epoch 5/10 Iteration: 19800 Avg. Training loss: 3.9766 0.3224 sec/batch
Epoch 5/10 Iteration: 19900 Avg. Training loss: 3.9336 0.3244 sec/batch
Epoch 5/10 Iteration: 20000 Avg. Training loss: 3.9566 0.3184 sec/batch
Nearest to not: asked, deutschlands, then, that, want, furthermore, yours, it,
Nearest to people: shooter, singapore, is, superclass, fula, algal, whites, native,
Nearest to between: earn, measured, modicum, cyrillic, leftrightarrow, vetted, occured, sporadically,
Nearest to new: proscriptions, revolution, york, elgar, ricks, prevailed, early, pappas,
Nearest to many: sub, generalised, culture, afghana, notable, have, in, seekers,
Nearest to the: in, is, a, each, as, of, and, sixth,
Nearest to up: busing, tolerate, sissy, gaited, anubis, hospice, hostname, toughened,
Nearest to which: nanotubes, phosphorylated, bsa, pagemaker, be, alternatively, large, electoral,
Nearest to applications: upto, unix, install, ended, cryptology, developers, tools, internationalization,
Nearest to professional: pushkin, played, xd, trophies, sports, popularity, gein, academies,
Nearest to older: householder, males, family, age, females, families, tupi, households,
Nearest to shown: ninhursag, macrolides, kilns, transfection, chatham, brouwer, blockers, deallocation,
Nearest to rise: hilbert, flower, klaip, larsson, thinly, diplomatically, nigsberg, underemployed,
Nearest to derived: judeo, reside, nonverbal, tribes, leet, cloned, sauces, opposition,
Nearest to cost: upgrades, lack, thurgood, guerillas, profitable, immediate, improving, benefit,
Nearest to existence: view, microseconds, quantifiers, auras, emptied, sedgewick, postulate, adjectives,
Epoch 5/10 Iteration: 20100 Avg. Training loss: 4.0053 0.3398 sec/batch
Epoch 5/10 Iteration: 20200 Avg. Training loss: 3.9992 0.3191 sec/batch
Epoch 5/10 Iteration: 20300 Avg. Training loss: 3.9586 0.3195 sec/batch
Epoch 5/10 Iteration: 20400 Avg. Training loss: 4.0063 0.3283 sec/batch
Epoch 5/10 Iteration: 20500 Avg. Training loss: 4.0515 0.3030 sec/batch
Epoch 5/10 Iteration: 20600 Avg. Training loss: 3.9526 0.3022 sec/batch
Epoch 5/10 Iteration: 20700 Avg. Training loss: 3.9850 0.3258 sec/batch
Epoch 5/10 Iteration: 20800 Avg. Training loss: 3.9956 0.3051 sec/batch
Epoch 5/10 Iteration: 20900 Avg. Training loss: 3.9825 0.3057 sec/batch
Epoch 5/10 Iteration: 21000 Avg. Training loss: 3.9986 0.3023 sec/batch
Nearest to not: asked, that, then, deutschlands, want, it, if, zanjeer,
Nearest to people: shooter, singapore, superclass, algal, whites, is, fula, less,
Nearest to between: earn, measured, cyrillic, modicum, usc, halfbakery, sporadically, boudin,
Nearest to new: york, called, elgar, ricks, revolution, proscriptions, pappas, springing,
Nearest to many: sub, generalised, have, derry, broadway, afghana, culture, tours,
Nearest to the: in, is, a, of, each, as, and, third,
Nearest to up: tolerate, anubis, busing, hostname, sissy, gaited, disk, have,
Nearest to which: pagemaker, electoral, nanotubes, phosphorylated, handy, be, alternatively, pneumococcus,
Nearest to applications: upto, unix, install, ended, cryptology, developers, internationalization, desktop,
Nearest to professional: xd, pushkin, gein, sports, popularity, played, academies, designating,
Nearest to older: householder, males, family, age, families, females, households, median,
Nearest to shown: ninhursag, a, macrolides, kilns, brouwer, transfection, blockers, simulant,
Nearest to rise: hilbert, larsson, flower, nigsberg, klaip, diplomatically, distancing, censorship,
Nearest to derived: judeo, reside, leet, nonverbal, opposition, sanskrit, cloned, linear,
Nearest to cost: upgrades, lack, profitable, thurgood, cass, guerillas, nix, improving,
Nearest to existence: view, quantifiers, microseconds, postulate, sedgewick, transcendent, emptied, adjectives,
Epoch 5/10 Iteration: 21100 Avg. Training loss: 3.9864 0.3076 sec/batch
Epoch 5/10 Iteration: 21200 Avg. Training loss: 3.9841 0.3018 sec/batch
Epoch 5/10 Iteration: 21300 Avg. Training loss: 3.9695 0.3068 sec/batch
Epoch 5/10 Iteration: 21400 Avg. Training loss: 3.9841 0.3166 sec/batch
Epoch 5/10 Iteration: 21500 Avg. Training loss: 4.0080 0.3040 sec/batch
Epoch 5/10 Iteration: 21600 Avg. Training loss: 4.0079 0.3129 sec/batch
Epoch 5/10 Iteration: 21700 Avg. Training loss: 4.0074 0.3111 sec/batch
Epoch 5/10 Iteration: 21800 Avg. Training loss: 3.9808 0.3423 sec/batch
Epoch 5/10 Iteration: 21900 Avg. Training loss: 3.9756 0.3835 sec/batch
Epoch 5/10 Iteration: 22000 Avg. Training loss: 4.0443 0.3172 sec/batch
Nearest to not: asked, that, ddd, furthermore, deutschlands, then, zanjeer, want,
Nearest to people: shooter, singapore, whites, fula, superclass, millions, algal, is,
Nearest to between: cyrillic, negus, modicum, earn, christianized, relations, from, measured,
Nearest to new: revolution, york, proscriptions, prevailed, ricks, early, elgar, called,
Nearest to many: sub, generalised, seekers, afghana, have, derry, tours, notable,
Nearest to the: in, is, of, a, and, as, s, from,
Nearest to up: tolerate, busing, anubis, hostname, have, disk, gaited, hospice,
Nearest to which: pagemaker, bsa, nanotubes, electoral, failed, parsis, phosphorylated, handy,
Nearest to applications: upto, unix, install, ended, developers, yudhoyono, cryptology, internationalization,
Nearest to professional: pushkin, xd, gein, sports, preventive, popularity, designating, ifex,
Nearest to older: householder, males, family, age, families, females, households, median,
Nearest to shown: ninhursag, a, transfection, kilns, blockers, macrolides, brouwer, deallocation,
Nearest to rise: hilbert, larsson, diplomatically, klaip, flower, distancing, nigsberg, savonarola,
Nearest to derived: judeo, reside, leet, sanskrit, cloned, tribes, opposition, subordinate,
Nearest to cost: lack, upgrades, cass, thurgood, profitable, serrated, guerillas, benefit,
Nearest to existence: view, quantifiers, microseconds, sedgewick, postulate, adjectives, emptied, transcendent,
Epoch 5/10 Iteration: 22100 Avg. Training loss: 3.9708 0.3167 sec/batch
Epoch 5/10 Iteration: 22200 Avg. Training loss: 4.1019 0.3124 sec/batch
Epoch 5/10 Iteration: 22300 Avg. Training loss: 4.0690 0.3127 sec/batch
Epoch 5/10 Iteration: 22400 Avg. Training loss: 4.0310 0.2982 sec/batch
Epoch 5/10 Iteration: 22500 Avg. Training loss: 3.9488 0.3288 sec/batch
Epoch 5/10 Iteration: 22600 Avg. Training loss: 3.9340 0.2986 sec/batch
Epoch 5/10 Iteration: 22700 Avg. Training loss: 3.9846 0.2955 sec/batch
Epoch 5/10 Iteration: 22800 Avg. Training loss: 3.9132 0.2980 sec/batch
Epoch 5/10 Iteration: 22900 Avg. Training loss: 3.9740 0.3057 sec/batch
Epoch 5/10 Iteration: 23000 Avg. Training loss: 3.9776 0.2955 sec/batch
Nearest to not: asked, that, ddd, then, skold, ebay, want, furthermore,
Nearest to people: shooter, singapore, millions, alumni, whites, fula, superclass, rapp,
Nearest to between: earn, cyrillic, measured, modicum, halfbakery, relations, christianized, trilled,
Nearest to new: york, revolution, called, proscriptions, early, elgar, ricks, joysticks,
Nearest to many: sub, generalised, have, afghana, notable, seekers, more, criticisms,
Nearest to the: in, is, a, of, and, as, s, on,
Nearest to up: tolerate, have, anubis, gaited, landfill, busing, hostname, sissy,
Nearest to which: pagemaker, nanotubes, bsa, electoral, phosphorylated, alternatively, pneumococcus, handy,
Nearest to applications: upto, unix, install, developers, ended, yudhoyono, desktop, cryptology,
Nearest to professional: pushkin, gein, popularity, xd, sports, paganini, skill, academies,
Nearest to older: householder, family, males, age, families, females, tupi, households,
Nearest to shown: a, ninhursag, macrolides, brouwer, blockers, quadrants, transfection, ene,
Nearest to rise: hilbert, diplomatically, larsson, klaip, flower, distancing, underemployed, nigsberg,
Nearest to derived: leet, judeo, reside, opposition, cloned, subordinate, sanskrit, linear,
Nearest to cost: upgrades, lack, profitable, benefit, improving, cass, thurgood, serrated,
Nearest to existence: view, microseconds, quantifiers, sedgewick, postulate, emptied, adjectives, auras,
Epoch 5/10 Iteration: 23100 Avg. Training loss: 3.9978 0.2981 sec/batch
Epoch 6/10 Iteration: 23200 Avg. Training loss: 3.9937 0.1953 sec/batch
Epoch 6/10 Iteration: 23300 Avg. Training loss: 3.9334 0.3015 sec/batch
Epoch 6/10 Iteration: 23400 Avg. Training loss: 3.9389 0.3027 sec/batch
Epoch 6/10 Iteration: 23500 Avg. Training loss: 3.9513 0.3042 sec/batch
Epoch 6/10 Iteration: 23600 Avg. Training loss: 3.9076 0.2996 sec/batch
Epoch 6/10 Iteration: 23700 Avg. Training loss: 3.9375 0.3100 sec/batch
Epoch 6/10 Iteration: 23800 Avg. Training loss: 3.8927 0.2820 sec/batch
Epoch 6/10 Iteration: 23900 Avg. Training loss: 3.9286 0.2855 sec/batch
Epoch 6/10 Iteration: 24000 Avg. Training loss: 3.9711 0.3048 sec/batch
Nearest to not: asked, then, that, skold, zanjeer, want, ddd, deutschlands,
Nearest to people: shooter, whites, singapore, fula, superclass, alumni, millions, essayists,
Nearest to between: earn, cyrillic, modicum, negus, measured, christianized, vetted, trilled,
Nearest to new: york, revolution, proscriptions, called, early, elgar, ricks, joysticks,
Nearest to many: sub, generalised, seekers, in, reasons, derry, have, notable,
Nearest to the: in, is, a, of, and, as, sixth, from,
Nearest to up: tolerate, have, sissy, materialize, busing, hospice, unleash, pocket,
Nearest to which: pagemaker, nanotubes, phosphorylated, the, bsa, pneumococcus, large, to,
Nearest to applications: upto, unix, install, developers, ended, desktop, yudhoyono, alamos,
Nearest to professional: sports, pushkin, popularity, gein, academies, played, xd, trophies,
Nearest to older: householder, family, males, age, families, females, tupi, households,
Nearest to shown: ninhursag, macrolides, brouwer, a, transfection, blockers, kilns, deallocation,
Nearest to rise: hilbert, klaip, larsson, diplomatically, flower, nigsberg, underemployed, distancing,
Nearest to derived: judeo, leet, reside, cloned, sanskrit, opposition, sensei, tribes,
Nearest to cost: upgrades, profitable, cass, thurgood, serrated, lack, benefit, guerillas,
Nearest to existence: microseconds, view, quantifiers, sedgewick, emptied, postulate, adjectives, bureaucrats,
Epoch 6/10 Iteration: 24100 Avg. Training loss: 3.9210 0.3094 sec/batch
Epoch 6/10 Iteration: 24200 Avg. Training loss: 4.0004 0.3183 sec/batch
Epoch 6/10 Iteration: 24300 Avg. Training loss: 3.8403 0.2975 sec/batch
Epoch 6/10 Iteration: 24400 Avg. Training loss: 3.9250 0.2977 sec/batch
Epoch 6/10 Iteration: 24500 Avg. Training loss: 3.9079 0.2963 sec/batch
Epoch 6/10 Iteration: 24600 Avg. Training loss: 3.8880 0.2952 sec/batch
Epoch 6/10 Iteration: 24700 Avg. Training loss: 3.9696 0.2941 sec/batch
Epoch 6/10 Iteration: 24800 Avg. Training loss: 3.9853 0.2951 sec/batch
Epoch 6/10 Iteration: 24900 Avg. Training loss: 3.9331 0.2949 sec/batch
Epoch 6/10 Iteration: 25000 Avg. Training loss: 3.9106 0.3041 sec/batch
Nearest to not: asked, that, then, furthermore, want, to, skold, ddd,
Nearest to people: shooter, alumni, singapore, whites, superclass, fula, millions, is,
Nearest to between: earn, measured, halfbakery, sporadically, cyrillic, usc, relations, trilled,
Nearest to new: york, revolution, early, called, springing, ricks, proscriptions, pappas,
Nearest to many: sub, generalised, have, derry, reasons, in, ordinals, more,
Nearest to the: in, is, a, as, of, and, s, from,
Nearest to up: tolerate, have, quantification, busing, hospice, pocket, landfill, pancreatitis,
Nearest to which: nanotubes, pagemaker, the, be, phosphorylated, pneumococcus, to, in,
Nearest to applications: upto, unix, install, ended, developers, desktop, alamos, cryptology,
Nearest to professional: sports, pushkin, popularity, academies, xd, competitions, ifex, skill,
Nearest to older: householder, males, family, age, families, females, median, households,
Nearest to shown: a, ninhursag, macrolides, been, kilns, blockers, transfection, quadrants,
Nearest to rise: hilbert, flower, larsson, distancing, nigsberg, heeled, diplomatically, klaip,
Nearest to derived: leet, judeo, cloned, reside, opposition, sanskrit, hebert, sensei,
Nearest to cost: upgrades, profitable, thurgood, cass, benefit, serrated, appropriately, lack,
Nearest to existence: view, microseconds, sedgewick, quantifiers, emptied, postulate, adjectives, jahwist,
Epoch 6/10 Iteration: 25100 Avg. Training loss: 3.9879 0.3007 sec/batch
Epoch 6/10 Iteration: 25200 Avg. Training loss: 3.9233 0.2985 sec/batch
Epoch 6/10 Iteration: 25300 Avg. Training loss: 3.9090 0.2834 sec/batch
Epoch 6/10 Iteration: 25400 Avg. Training loss: 3.9466 0.2891 sec/batch
Epoch 6/10 Iteration: 25500 Avg. Training loss: 3.9156 0.3012 sec/batch
Epoch 6/10 Iteration: 25600 Avg. Training loss: 3.9295 0.2957 sec/batch
Epoch 6/10 Iteration: 25700 Avg. Training loss: 3.9649 0.3253 sec/batch
Epoch 6/10 Iteration: 25800 Avg. Training loss: 3.9481 0.3893 sec/batch
Epoch 6/10 Iteration: 25900 Avg. Training loss: 3.9425 0.4332 sec/batch
Epoch 6/10 Iteration: 26000 Avg. Training loss: 3.9418 0.4480 sec/batch
Nearest to not: asked, that, then, ddd, to, furthermore, skold, ebay,
Nearest to people: shooter, alumni, singapore, essayists, whites, gdr, superclass, pooling,
Nearest to between: earn, halfbakery, relations, silvia, vetted, measured, christianized, negus,
Nearest to new: york, revolution, early, springing, elgar, called, proscriptions, prevailed,
Nearest to many: sub, derry, have, more, generalised, seekers, sinti, reasons,
Nearest to the: in, is, of, a, as, and, this, from,
Nearest to up: tolerate, busing, landfill, hospice, have, quantification, materialize, around,
Nearest to which: pagemaker, the, in, phosphorylated, electoral, nanotubes, be, a,
Nearest to applications: upto, unix, ended, developers, install, alamos, desktop, yudhoyono,
Nearest to professional: sports, pushkin, skill, popularity, competitions, xd, academies, ifex,
Nearest to older: householder, males, family, age, families, females, median, tupi,
Nearest to shown: a, ninhursag, macrolides, been, collaborators, transfection, blockers, kilns,
Nearest to rise: hilbert, larsson, distancing, nigsberg, flower, diplomatically, censorship, klaip,
Nearest to derived: opposition, judeo, leet, reside, cloned, hebert, subordinate, sanskrit,
Nearest to cost: thurgood, profitable, upgrades, benefit, cass, lack, guerillas, appropriately,
Nearest to existence: microseconds, quantifiers, sedgewick, view, postulate, adjectives, emptied, jahwist,
Epoch 6/10 Iteration: 26100 Avg. Training loss: 3.9775 0.4517 sec/batch
Epoch 6/10 Iteration: 26200 Avg. Training loss: 3.9356 0.3697 sec/batch
Epoch 6/10 Iteration: 26300 Avg. Training loss: 3.9720 0.3356 sec/batch
Epoch 6/10 Iteration: 26400 Avg. Training loss: 3.9151 0.3546 sec/batch
Epoch 6/10 Iteration: 26500 Avg. Training loss: 3.9130 0.3220 sec/batch
Epoch 6/10 Iteration: 26600 Avg. Training loss: 3.9528 0.3236 sec/batch
Epoch 6/10 Iteration: 26700 Avg. Training loss: 3.9181 0.3196 sec/batch
Epoch 6/10 Iteration: 26800 Avg. Training loss: 4.0232 0.3312 sec/batch
Epoch 6/10 Iteration: 26900 Avg. Training loss: 4.0103 0.3209 sec/batch
Epoch 6/10 Iteration: 27000 Avg. Training loss: 4.0187 0.3196 sec/batch
Nearest to not: that, asked, skold, furthermore, it, then, ddd, deutschlands,
Nearest to people: alumni, shooter, essayists, millions, politicians, singapore, this, pooling,
Nearest to between: negus, relations, halfbakery, cyrillic, boudin, vetted, earn, trilled,
Nearest to new: york, revolution, early, called, springing, elgar, proscriptions, routledge,
Nearest to many: sub, have, more, seekers, adel, sinti, these, derry,
Nearest to the: in, is, as, and, of, a, s, this,
Nearest to up: tolerate, hospice, busing, materialize, anubis, landfill, quantification, cakes,
Nearest to which: pagemaker, nanotubes, phosphorylated, in, electoral, counties, the, to,
Nearest to applications: unix, upto, install, developers, desktop, application, internationalization, yudhoyono,
Nearest to professional: pushkin, gein, skill, sports, xd, popularity, paganini, ifex,
Nearest to older: householder, males, family, age, females, families, median, tupi,
Nearest to shown: a, ninhursag, macrolides, collaborators, kilns, transfection, ene, been,
Nearest to rise: hilbert, larsson, flower, diplomatically, distancing, nigsberg, klaip, savonarola,
Nearest to derived: judeo, cloned, leet, reside, opposition, sanskrit, subordinate, hebert,
Nearest to cost: cass, thurgood, profitable, upgrades, benefit, lack, serrated, guerillas,
Nearest to existence: sedgewick, quantifiers, microseconds, view, postulate, adjectives, jahwist, dementia,
Epoch 6/10 Iteration: 27100 Avg. Training loss: 3.9148 0.3272 sec/batch
Epoch 6/10 Iteration: 27200 Avg. Training loss: 3.8722 0.3102 sec/batch
Epoch 6/10 Iteration: 27300 Avg. Training loss: 3.9364 0.3175 sec/batch
Epoch 6/10 Iteration: 27400 Avg. Training loss: 3.8400 0.3055 sec/batch
Epoch 6/10 Iteration: 27500 Avg. Training loss: 3.9594 0.3007 sec/batch
Epoch 6/10 Iteration: 27600 Avg. Training loss: 3.9507 0.3005 sec/batch
Epoch 6/10 Iteration: 27700 Avg. Training loss: 3.9299 0.3011 sec/batch
Epoch 7/10 Iteration: 27800 Avg. Training loss: 3.9884 0.1170 sec/batch
Epoch 7/10 Iteration: 27900 Avg. Training loss: 3.9269 0.3101 sec/batch
Epoch 7/10 Iteration: 28000 Avg. Training loss: 3.9387 0.3155 sec/batch
Nearest to not: asked, that, zanjeer, furthermore, want, then, personal, deutschlands,
Nearest to people: alumni, millions, shooter, politicians, essayists, gdr, fula, pooling,
Nearest to between: halfbakery, vetted, negus, cyrillic, relations, trilled, sporadically, shimon,
Nearest to new: york, revolution, early, called, elgar, proscriptions, springing, restore,
Nearest to many: sub, more, have, generalised, reasons, adel, these, influential,
Nearest to the: in, is, of, a, and, as, from, this,
Nearest to up: tolerate, hospice, around, unleash, busing, materialize, have, cakes,
Nearest to which: nanotubes, the, pagemaker, in, phosphorylated, its, to, electoral,
Nearest to applications: upto, unix, developers, ended, install, desktop, yudhoyono, cryptology,
Nearest to professional: pushkin, skill, sports, gein, popularity, ifex, academies, xd,
Nearest to older: householder, family, males, age, females, families, tupi, median,
Nearest to shown: a, blockers, macrolides, transfection, ninhursag, quadrants, ene, been,
Nearest to rise: hilbert, larsson, flower, distancing, klaip, diplomatically, nigsberg, censorship,
Nearest to derived: judeo, leet, cloned, reside, opposition, sanskrit, hebrew, numbering,
Nearest to cost: thurgood, cass, profitable, upgrades, benefit, lack, appropriately, serrated,
Nearest to existence: sedgewick, quantifiers, view, microseconds, postulate, adjectives, jahwist, race,
Epoch 7/10 Iteration: 28100 Avg. Training loss: 3.9063 0.3214 sec/batch
Epoch 7/10 Iteration: 28200 Avg. Training loss: 3.9311 0.3069 sec/batch
Epoch 7/10 Iteration: 28300 Avg. Training loss: 3.8942 0.2962 sec/batch
Epoch 7/10 Iteration: 28400 Avg. Training loss: 3.9103 0.3097 sec/batch
Epoch 7/10 Iteration: 28500 Avg. Training loss: 3.8531 0.3028 sec/batch
Epoch 7/10 Iteration: 28600 Avg. Training loss: 3.8983 0.3149 sec/batch
Epoch 7/10 Iteration: 28700 Avg. Training loss: 3.9152 0.3006 sec/batch
Epoch 7/10 Iteration: 28800 Avg. Training loss: 3.9399 0.3048 sec/batch
Epoch 7/10 Iteration: 28900 Avg. Training loss: 3.8253 0.3034 sec/batch
Epoch 7/10 Iteration: 29000 Avg. Training loss: 3.8598 0.3170 sec/batch
Nearest to not: asked, that, zanjeer, yours, then, to, but, ddd,
Nearest to people: millions, alumni, shooter, politicians, fula, pooling, singapore, superclass,
Nearest to between: earn, vetted, negus, relations, halfbakery, both, near, cyrillic,
Nearest to new: york, called, revolution, springing, early, proscriptions, restore, prevailed,
Nearest to many: sub, have, more, generalised, in, notable, reasons, these,
Nearest to the: in, is, as, of, a, and, from, to,
Nearest to up: tolerate, around, have, quantification, materialize, hospice, landfill, sissy,
Nearest to which: the, in, to, its, phosphorylated, being, pagemaker, electoral,
Nearest to applications: upto, unix, developers, ended, install, cryptology, alamos, gogol,
Nearest to professional: pushkin, sports, skill, ifex, popularity, academies, competitions, preventive,
Nearest to older: householder, males, age, family, females, families, median, tupi,
Nearest to shown: a, been, macrolides, transfection, blockers, ninhursag, collaborators, quadrants,
Nearest to rise: hilbert, flower, larsson, distancing, snapper, diplomatically, heeled, underemployed,
Nearest to derived: judeo, leet, cloned, opposition, sanskrit, reside, hebrew, hebert,
Nearest to cost: thurgood, cass, upgrades, benefit, lack, appropriately, profitable, assr,
Nearest to existence: quantifiers, sedgewick, microseconds, view, inert, postulate, adjectives, race,
Epoch 7/10 Iteration: 29100 Avg. Training loss: 3.9101 0.3059 sec/batch
Epoch 7/10 Iteration: 29200 Avg. Training loss: 3.8247 0.2930 sec/batch
Epoch 7/10 Iteration: 29300 Avg. Training loss: 3.9164 0.3156 sec/batch
Epoch 7/10 Iteration: 29400 Avg. Training loss: 3.9088 0.3223 sec/batch
Epoch 7/10 Iteration: 29500 Avg. Training loss: 3.8922 0.3139 sec/batch
Epoch 7/10 Iteration: 29600 Avg. Training loss: 3.8912 0.2887 sec/batch
Epoch 7/10 Iteration: 29700 Avg. Training loss: 3.9826 0.2984 sec/batch
Epoch 7/10 Iteration: 29800 Avg. Training loss: 3.9000 0.3029 sec/batch
Epoch 7/10 Iteration: 29900 Avg. Training loss: 3.8888 0.3015 sec/batch
Epoch 7/10 Iteration: 30000 Avg. Training loss: 3.8976 0.3050 sec/batch
Nearest to not: that, asked, but, to, furthermore, it, if, then,
Nearest to people: alumni, millions, shooter, politicians, pooling, superclass, autodidacts, fula,
Nearest to between: vetted, earn, halfbakery, relations, interval, usc, negus, both,
Nearest to new: york, called, revolution, springing, early, elgar, prentice, proscriptions,
Nearest to many: sub, more, have, generalised, reasons, as, ordinals, notable,
Nearest to the: in, is, of, a, as, and, that, from,
Nearest to up: tolerate, have, around, quantification, unleash, materialize, hospice, landfill,
Nearest to which: the, to, phosphorylated, in, pagemaker, its, nanotubes, distinguishing,
Nearest to applications: upto, unix, ended, cryptology, install, gogol, developers, application,
Nearest to professional: pushkin, sports, academies, ifex, skill, gein, competitions, preventive,
Nearest to older: householder, age, family, males, females, families, median, household,
Nearest to shown: a, been, macrolides, blockers, transfection, ninhursag, collaborators, tarr,
Nearest to rise: hilbert, flower, larsson, distancing, ectopic, expression, nigsberg, heeled,
Nearest to derived: cloned, judeo, leet, opposition, hebrew, reside, linear, sanskrit,
Nearest to cost: thurgood, benefit, cass, upgrades, lack, appropriately, profitable, costs,
Nearest to existence: quantifiers, microseconds, view, sedgewick, inert, postulate, adjectives, dementia,
Epoch 7/10 Iteration: 30100 Avg. Training loss: 3.9159 0.3069 sec/batch
Epoch 7/10 Iteration: 30200 Avg. Training loss: 3.9288 0.3129 sec/batch
Epoch 7/10 Iteration: 30300 Avg. Training loss: 3.8691 0.3053 sec/batch
Epoch 7/10 Iteration: 30400 Avg. Training loss: 3.8968 0.3142 sec/batch
Epoch 7/10 Iteration: 30500 Avg. Training loss: 3.9104 0.3062 sec/batch
Epoch 7/10 Iteration: 30600 Avg. Training loss: 3.9059 0.3014 sec/batch
Epoch 7/10 Iteration: 30700 Avg. Training loss: 3.9151 0.3024 sec/batch
Epoch 7/10 Iteration: 30800 Avg. Training loss: 3.9062 0.3039 sec/batch
Epoch 7/10 Iteration: 30900 Avg. Training loss: 3.9217 0.3004 sec/batch
Epoch 7/10 Iteration: 31000 Avg. Training loss: 3.8683 0.3009 sec/batch
Nearest to not: that, asked, furthermore, but, ddd, to, it, zanjeer,
Nearest to people: alumni, millions, essayists, politicians, shooter, autodidacts, is, lovers,
Nearest to between: negus, halfbakery, relations, vetted, both, percival, earn, silvia,
Nearest to new: york, revolution, called, springing, early, proscriptions, elgar, restore,
Nearest to many: sub, have, more, these, as, generalised, were, in,
Nearest to the: in, of, and, is, as, a, from, by,
Nearest to up: tolerate, around, quantification, hospice, landfill, have, busing, materialize,
Nearest to which: in, the, to, its, phosphorylated, a, pagemaker, principal,
Nearest to applications: unix, upto, ended, install, developers, cryptology, desktop, gogol,
Nearest to professional: sports, pushkin, skill, ifex, academies, gein, competitions, preventive,
Nearest to older: householder, age, family, females, males, families, median, tupi,
Nearest to shown: a, been, collaborators, ninhursag, macrolides, tarr, blockers, transfection,
Nearest to rise: hilbert, larsson, flower, distancing, nigsberg, diplomatically, expression, snapper,
Nearest to derived: judeo, hebrew, cloned, opposition, leet, sanskrit, hebert, reside,
Nearest to cost: thurgood, cass, lack, benefit, profitable, upgrades, costs, assr,
Nearest to existence: microseconds, sedgewick, view, postulate, inert, quantifiers, adjectives, heavenly,
Epoch 7/10 Iteration: 31100 Avg. Training loss: 3.9079 0.3051 sec/batch
Epoch 7/10 Iteration: 31200 Avg. Training loss: 3.9070 0.2986 sec/batch
Epoch 7/10 Iteration: 31300 Avg. Training loss: 3.8939 0.3039 sec/batch
Epoch 7/10 Iteration: 31400 Avg. Training loss: 3.9711 0.3063 sec/batch
Epoch 7/10 Iteration: 31500 Avg. Training loss: 3.9996 0.3100 sec/batch
Epoch 7/10 Iteration: 31600 Avg. Training loss: 3.9707 0.3118 sec/batch
Epoch 7/10 Iteration: 31700 Avg. Training loss: 3.9255 0.3064 sec/batch
Epoch 7/10 Iteration: 31800 Avg. Training loss: 3.8523 0.3148 sec/batch
Epoch 7/10 Iteration: 31900 Avg. Training loss: 3.8815 0.3087 sec/batch
Epoch 7/10 Iteration: 32000 Avg. Training loss: 3.7929 0.3306 sec/batch
Nearest to not: that, asked, but, furthermore, to, ddd, it, zanjeer,
Nearest to people: alumni, politicians, millions, autodidacts, essayists, births, lovers, leu,
Nearest to between: cyrillic, negus, relations, halfbakery, both, percival, macedonian, vetted,
Nearest to new: york, revolution, called, early, transducer, springing, elgar, unhelpful,
Nearest to many: more, sub, have, as, these, such, in, were,
Nearest to the: in, of, and, is, a, as, from, its,
Nearest to up: tolerate, have, around, materialize, landfill, quantification, hospice, busing,
Nearest to which: in, its, the, to, a, of, under, from,
Nearest to applications: upto, unix, install, cryptology, developers, application, ended, alamos,
Nearest to professional: pushkin, skill, gein, sports, ifex, paganini, glamorous, xd,
Nearest to older: householder, age, family, females, males, families, household, median,
Nearest to shown: a, been, collaborators, macrolides, blockers, tarr, ninhursag, ene,
Nearest to rise: hilbert, flower, larsson, distancing, diplomatically, expression, nigsberg, ecological,
Nearest to derived: judeo, leet, cloned, hebrew, opposition, sanskrit, numbering, reside,
Nearest to cost: thurgood, cass, benefit, lack, costs, profitable, upgrades, assr,
Nearest to existence: microseconds, inert, sedgewick, postulate, adjectives, view, quantifiers, dementia,
Epoch 7/10 Iteration: 32100 Avg. Training loss: 3.9124 0.3106 sec/batch
Epoch 7/10 Iteration: 32200 Avg. Training loss: 3.9346 0.3368 sec/batch
Epoch 7/10 Iteration: 32300 Avg. Training loss: 3.8915 0.3423 sec/batch
Epoch 8/10 Iteration: 32400 Avg. Training loss: 3.9234 0.0339 sec/batch
Epoch 8/10 Iteration: 32500 Avg. Training loss: 3.8999 0.3155 sec/batch
Epoch 8/10 Iteration: 32600 Avg. Training loss: 3.8885 0.3306 sec/batch
Epoch 8/10 Iteration: 32700 Avg. Training loss: 3.8757 0.3203 sec/batch
Epoch 8/10 Iteration: 32800 Avg. Training loss: 3.8981 0.3025 sec/batch
Epoch 8/10 Iteration: 32900 Avg. Training loss: 3.8322 0.3282 sec/batch
Epoch 8/10 Iteration: 33000 Avg. Training loss: 3.8513 0.3249 sec/batch
Nearest to not: asked, that, to, but, furthermore, zanjeer, personal, then,
Nearest to people: alumni, millions, autodidacts, politicians, essayists, lovers, martian, shooter,
Nearest to between: negus, vetted, shimon, halfbakery, cyrillic, both, silvia, percival,
Nearest to new: york, early, called, revolution, springing, proscriptions, unhelpful, elgar,
Nearest to many: have, sub, more, these, reasons, in, as, such,
Nearest to the: in, of, a, is, and, as, from, to,
Nearest to up: have, tolerate, quantification, around, materialize, landfill, unleash, hammadi,
Nearest to which: the, in, to, its, was, a, from, of,
Nearest to applications: unix, upto, install, developers, cryptology, application, desktop, ended,
Nearest to professional: skill, pushkin, sports, ifex, gein, academies, competitions, ifbb,
Nearest to older: householder, age, family, females, families, males, household, median,
Nearest to shown: been, a, collaborators, macrolides, tarr, ninhursag, blockers, transfection,
Nearest to rise: flower, hilbert, larsson, distancing, diplomatically, expression, nigsberg, snapper,
Nearest to derived: judeo, cloned, leet, hebrew, opposition, sanskrit, reside, numbering,
Nearest to cost: thurgood, profitable, cass, upgrades, lack, benefit, assr, costs,
Nearest to existence: view, microseconds, inert, sedgewick, adjectives, race, postulate, quantifiers,
Epoch 8/10 Iteration: 33100 Avg. Training loss: 3.8498 0.3139 sec/batch
Epoch 8/10 Iteration: 33200 Avg. Training loss: 3.8831 0.3071 sec/batch
Epoch 8/10 Iteration: 33300 Avg. Training loss: 3.9123 0.3097 sec/batch
Epoch 8/10 Iteration: 33400 Avg. Training loss: 3.8818 0.3304 sec/batch
Epoch 8/10 Iteration: 33500 Avg. Training loss: 3.8392 0.3234 sec/batch
Epoch 8/10 Iteration: 33600 Avg. Training loss: 3.8270 0.3240 sec/batch
Epoch 8/10 Iteration: 33700 Avg. Training loss: 3.8235 0.3102 sec/batch
Epoch 8/10 Iteration: 33800 Avg. Training loss: 3.8135 0.3064 sec/batch
Epoch 8/10 Iteration: 33900 Avg. Training loss: 3.8598 0.3489 sec/batch
Epoch 8/10 Iteration: 34000 Avg. Training loss: 3.8874 0.3284 sec/batch
Nearest to not: that, asked, to, but, then, for, it, is,
Nearest to people: alumni, millions, politicians, is, autodidacts, lovers, essayists, pooling,
Nearest to between: both, halfbakery, interval, relations, cyrillic, macedonian, trilled, vetted,
Nearest to new: york, early, called, springing, revolution, elgar, proscriptions, appalachian,
Nearest to many: sub, have, as, such, in, these, more, also,
Nearest to the: in, is, a, and, as, of, to, from,
Nearest to up: have, tolerate, quantification, around, materialize, busing, hospice, cakes,
Nearest to which: the, in, to, its, a, from, of, an,
Nearest to applications: unix, upto, install, developers, cryptology, application, ended, internationalization,
Nearest to professional: skill, sports, pushkin, competitions, academies, ifex, training, gein,
Nearest to older: householder, age, family, families, females, males, household, median,
Nearest to shown: a, been, collaborators, blockers, macrolides, tarr, transfection, mst,
Nearest to rise: flower, hilbert, larsson, distancing, diplomatically, expression, nigsberg, ectopic,
Nearest to derived: cloned, judeo, leet, opposition, hebrew, disassembly, reside, numbering,
Nearest to cost: thurgood, lack, cass, benefit, profitable, costs, upgrades, assr,
Nearest to existence: view, inert, sedgewick, microseconds, postulate, race, dementia, adjectives,
Epoch 8/10 Iteration: 34100 Avg. Training loss: 3.8696 0.3381 sec/batch
Epoch 8/10 Iteration: 34200 Avg. Training loss: 3.8580 0.3268 sec/batch
Epoch 8/10 Iteration: 34300 Avg. Training loss: 3.9337 0.3263 sec/batch
Epoch 8/10 Iteration: 34400 Avg. Training loss: 3.8908 0.3339 sec/batch
Epoch 8/10 Iteration: 34500 Avg. Training loss: 3.8380 0.3282 sec/batch
Epoch 8/10 Iteration: 34600 Avg. Training loss: 3.8881 0.3182 sec/batch
Epoch 8/10 Iteration: 34700 Avg. Training loss: 3.8980 0.3285 sec/batch
Epoch 8/10 Iteration: 34800 Avg. Training loss: 3.8667 0.3152 sec/batch
Epoch 8/10 Iteration: 34900 Avg. Training loss: 3.8782 0.3152 sec/batch
Epoch 8/10 Iteration: 35000 Avg. Training loss: 3.9248 0.2910 sec/batch
Nearest to not: that, asked, to, but, then, is, furthermore, for,
Nearest to people: alumni, politicians, millions, essayists, autodidacts, lovers, native, songwriters,
Nearest to between: both, halfbakery, relations, vetted, percival, earn, predominantly, silvia,
Nearest to new: york, called, springing, revolution, early, elgar, unhelpful, prentice,
Nearest to many: have, sub, these, more, as, such, frequently, also,
Nearest to the: in, is, a, of, as, and, which, this,
Nearest to up: tolerate, have, around, quantification, materialize, busing, sissy, hospice,
Nearest to which: the, in, to, its, a, of, from, is,
Nearest to applications: upto, unix, cryptology, developers, install, application, ended, shugart,
Nearest to professional: skill, pushkin, gein, sports, paganini, glamorous, xd, gees,
Nearest to older: householder, age, females, family, families, males, median, household,
Nearest to shown: a, been, collaborators, mst, ninhursag, transfection, tarr, appeared,
Nearest to rise: flower, hilbert, larsson, nigsberg, distancing, diplomatically, expression, ectopic,
Nearest to derived: judeo, cloned, opposition, leet, hebrew, finn, meaning, numbering,
Nearest to cost: thurgood, lack, cass, profitable, benefit, assr, costs, upgrades,
Nearest to existence: inert, microseconds, race, view, postulate, dementia, sedgewick, adjectives,
Epoch 8/10 Iteration: 35100 Avg. Training loss: 3.8912 0.3024 sec/batch
Epoch 8/10 Iteration: 35200 Avg. Training loss: 3.8666 0.2994 sec/batch
Epoch 8/10 Iteration: 35300 Avg. Training loss: 3.8864 0.2875 sec/batch
Epoch 8/10 Iteration: 35400 Avg. Training loss: 3.9053 0.3014 sec/batch
Epoch 8/10 Iteration: 35500 Avg. Training loss: 3.8784 0.2964 sec/batch
Epoch 8/10 Iteration: 35600 Avg. Training loss: 3.8868 0.3035 sec/batch
Epoch 8/10 Iteration: 35700 Avg. Training loss: 3.8532 0.3089 sec/batch
Epoch 8/10 Iteration: 35800 Avg. Training loss: 3.8633 0.2898 sec/batch
Epoch 8/10 Iteration: 35900 Avg. Training loss: 3.9278 0.3018 sec/batch
Epoch 8/10 Iteration: 36000 Avg. Training loss: 3.8389 0.2824 sec/batch
Nearest to not: that, asked, but, furthermore, for, to, it, then,
Nearest to people: alumni, millions, essayists, autodidacts, lovers, politicians, births, songwriters,
Nearest to between: both, negus, relations, silvia, halfbakery, vetted, predominantly, cyrillic,
Nearest to new: york, revolution, early, springing, called, first, in, elgar,
Nearest to many: these, sub, have, as, more, frequently, also, both,
Nearest to the: in, of, as, is, and, a, on, s,
Nearest to up: tolerate, materialize, quantification, around, landfill, have, hammadi, busing,
Nearest to which: the, in, its, to, a, principal, was, annihilation,
Nearest to applications: unix, upto, developers, cryptology, application, alamos, install, ended,
Nearest to professional: pushkin, skill, gein, sports, paganini, xd, gees, glamorous,
Nearest to older: householder, age, females, family, males, families, household, median,
Nearest to shown: a, been, collaborators, ninhursag, quadrants, tarr, transfection, ene,
Nearest to rise: hilbert, flower, larsson, distancing, diplomatically, nigsberg, savonarola, expression,
Nearest to derived: cloned, judeo, hebrew, leet, opposition, sanskrit, meaning, ionia,
Nearest to cost: thurgood, lack, cass, benefit, costs, profitable, assr, upgrades,
Nearest to existence: inert, sedgewick, microseconds, dementia, postulate, race, view, adjectives,
Epoch 8/10 Iteration: 36100 Avg. Training loss: 3.9835 0.2973 sec/batch
Epoch 8/10 Iteration: 36200 Avg. Training loss: 3.9687 0.2859 sec/batch
Epoch 8/10 Iteration: 36300 Avg. Training loss: 3.8989 0.2966 sec/batch
Epoch 8/10 Iteration: 36400 Avg. Training loss: 3.8172 0.3021 sec/batch
Epoch 8/10 Iteration: 36500 Avg. Training loss: 3.8683 0.2938 sec/batch
Epoch 8/10 Iteration: 36600 Avg. Training loss: 3.8115 0.3038 sec/batch
Epoch 8/10 Iteration: 36700 Avg. Training loss: 3.8596 0.2986 sec/batch
Epoch 8/10 Iteration: 36800 Avg. Training loss: 3.8762 0.2978 sec/batch
Epoch 8/10 Iteration: 36900 Avg. Training loss: 3.8752 0.2949 sec/batch
Epoch 8/10 Iteration: 37000 Avg. Training loss: 3.8815 0.2999 sec/batch
Nearest to not: that, asked, but, to, then, for, furthermore, zanjeer,
Nearest to people: alumni, millions, essayists, autodidacts, lovers, politicians, songwriters, births,
Nearest to between: both, silvia, relations, interval, halfbakery, cyrillic, predominantly, trilled,
Nearest to new: york, early, revolution, called, springing, classic, it, elgar,
Nearest to many: have, these, sub, as, such, more, also, frequently,
Nearest to the: in, of, a, is, and, as, on, s,
Nearest to up: tolerate, have, materialize, quantification, around, landfill, hospice, cakes,
Nearest to which: the, in, its, to, a, of, from, principal,
Nearest to applications: upto, unix, developers, application, install, ended, cryptology, internationalization,
Nearest to professional: pushkin, skill, gein, training, ifex, glamorous, gees, paganini,
Nearest to older: householder, family, age, females, males, household, families, income,
Nearest to shown: a, collaborators, been, transfection, tarr, blockers, quadrants, mst,
Nearest to rise: hilbert, flower, larsson, diplomatically, distancing, expression, nigsberg, snapper,
Nearest to derived: cloned, leet, judeo, hebrew, meaning, opposition, numbering, sanskrit,
Nearest to cost: lack, thurgood, cass, costs, profitable, benefit, assr, upgrades,
Nearest to existence: sedgewick, race, dementia, inert, postulate, microseconds, adjectives, ptah,
Epoch 9/10 Iteration: 37100 Avg. Training loss: 3.8673 0.2586 sec/batch
Epoch 9/10 Iteration: 37200 Avg. Training loss: 3.8587 0.3223 sec/batch
Epoch 9/10 Iteration: 37300 Avg. Training loss: 3.8508 0.3013 sec/batch
Epoch 9/10 Iteration: 37400 Avg. Training loss: 3.8837 0.3018 sec/batch
Epoch 9/10 Iteration: 37500 Avg. Training loss: 3.8193 0.3019 sec/batch
Epoch 9/10 Iteration: 37600 Avg. Training loss: 3.8269 0.2980 sec/batch
Epoch 9/10 Iteration: 37700 Avg. Training loss: 3.7904 0.3031 sec/batch
Epoch 9/10 Iteration: 37800 Avg. Training loss: 3.8738 0.3259 sec/batch
Epoch 9/10 Iteration: 37900 Avg. Training loss: 3.8545 0.3195 sec/batch
Epoch 9/10 Iteration: 38000 Avg. Training loss: 3.8763 0.3137 sec/batch
Nearest to not: that, asked, to, but, for, then, it, zanjeer,
Nearest to people: alumni, millions, lovers, autodidacts, politicians, essayists, songwriters, births,
Nearest to between: both, silvia, interval, vetted, kla, halfbakery, predominantly, macedonian,
Nearest to new: york, early, called, era, revolution, springing, first, it,
Nearest to many: have, sub, these, frequently, such, also, as, more,
Nearest to the: in, of, a, is, and, which, as, to,
Nearest to up: materialize, cakes, have, quantification, tolerate, sissy, landfill, hospice,
Nearest to which: the, in, to, its, of, a, from, principal,
Nearest to applications: upto, unix, developers, application, ended, cryptology, install, internationalization,
Nearest to professional: pushkin, skill, sports, training, competitions, ifex, gees, gein,
Nearest to older: householder, age, family, females, household, families, males, income,
Nearest to shown: been, collaborators, a, tarr, appeared, quadrants, mst, transfection,
Nearest to rise: hilbert, flower, larsson, diplomatically, nigsberg, expression, snapper, distancing,
Nearest to derived: cloned, judeo, leet, hebrew, meaning, opposition, reside, numbering,
Nearest to cost: thurgood, cass, lack, profitable, assr, costs, benefit, extolling,
Nearest to existence: race, sedgewick, inert, dementia, postulate, microseconds, view, jahwist,
Epoch 9/10 Iteration: 38100 Avg. Training loss: 3.8594 0.3099 sec/batch
Epoch 9/10 Iteration: 38200 Avg. Training loss: 3.7701 0.2968 sec/batch
Epoch 9/10 Iteration: 38300 Avg. Training loss: 3.8541 0.2988 sec/batch
Epoch 9/10 Iteration: 38400 Avg. Training loss: 3.8130 0.3068 sec/batch
Epoch 9/10 Iteration: 38500 Avg. Training loss: 3.7980 0.2970 sec/batch
Epoch 9/10 Iteration: 38600 Avg. Training loss: 3.8655 0.3037 sec/batch
Epoch 9/10 Iteration: 38700 Avg. Training loss: 3.8815 0.3168 sec/batch
Epoch 9/10 Iteration: 38800 Avg. Training loss: 3.8252 0.3061 sec/batch
Epoch 9/10 Iteration: 38900 Avg. Training loss: 3.8820 0.3110 sec/batch
Epoch 9/10 Iteration: 39000 Avg. Training loss: 3.8892 0.2871 sec/batch
Nearest to not: that, asked, then, but, to, for, yours, it,
Nearest to people: alumni, millions, essayists, politicians, lovers, autodidacts, songwriters, pooling,
Nearest to between: both, silvia, relations, interval, vetted, predominantly, halfbakery, usc,
Nearest to new: york, called, early, springing, isoleucine, routledge, prentice, revolution,
Nearest to many: sub, as, have, also, such, these, more, in,
Nearest to the: in, of, is, a, and, as, which, to,
Nearest to up: materialize, sissy, tolerate, have, quantification, cakes, hospice, busing,
Nearest to which: the, to, in, a, its, of, an, from,
Nearest to applications: upto, unix, alamos, ended, internationalization, cryptology, application, developers,
Nearest to professional: training, pushkin, skill, ifex, sports, academies, gees, competitions,
Nearest to older: householder, age, females, family, household, males, families, income,
Nearest to shown: been, a, collaborators, appeared, has, tarr, mst, transfection,
Nearest to rise: hilbert, flower, larsson, expression, distancing, nigsberg, diplomatically, ectopic,
Nearest to derived: cloned, judeo, leet, opposition, gpl, meaning, hebrew, noftsker,
Nearest to cost: thurgood, lack, cass, costs, profitable, benefit, assr, appropriately,
Nearest to existence: race, inert, sedgewick, dementia, view, postulate, ptah, adjectives,
Epoch 9/10 Iteration: 39100 Avg. Training loss: 3.8238 0.2884 sec/batch
Epoch 9/10 Iteration: 39200 Avg. Training loss: 3.8602 0.2947 sec/batch
Epoch 9/10 Iteration: 39300 Avg. Training loss: 3.8370 0.3781 sec/batch
Epoch 9/10 Iteration: 39400 Avg. Training loss: 3.8642 0.3534 sec/batch
Epoch 9/10 Iteration: 39500 Avg. Training loss: 3.8291 0.3499 sec/batch
Epoch 9/10 Iteration: 39600 Avg. Training loss: 3.8748 0.3337 sec/batch
Epoch 9/10 Iteration: 39700 Avg. Training loss: 3.8776 0.2701 sec/batch
Epoch 9/10 Iteration: 39800 Avg. Training loss: 3.8357 12.3689 sec/batch
Epoch 9/10 Iteration: 39900 Avg. Training loss: 3.8488 0.3267 sec/batch

Restore the trained network if you need to:


In [ ]:
with train_graph.as_default():
    saver = tf.train.Saver()

with tf.Session(graph=train_graph) as sess:
    # Load the most recent checkpoint and pull out the trained embedding matrix
    saver.restore(sess, tf.train.latest_checkpoint('checkpoints'))
    embed_mat = sess.run(embedding)
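
With embed_mat pulled out of the graph, you can reproduce the "Nearest to ..." listings printed during training without a TensorFlow session, using a plain cosine-similarity lookup against the embedding matrix. This is a minimal sketch rather than part of the original notebook; it assumes the vocab_to_int lookup built during preprocessing earlier, along with int_to_vocab and the embed_mat loaded above.


In [ ]:
def nearest_words(word, embed_mat, vocab_to_int, int_to_vocab, k=8):
    # Normalize rows so dot products become cosine similarities
    normed = embed_mat / np.linalg.norm(embed_mat, axis=1, keepdims=True)
    # Similarity of the query word's vector to every other word vector
    sims = normed @ normed[vocab_to_int[word]]
    # Sort descending and skip index 0, which is the query word itself
    nearest = np.argsort(sims)[::-1][1:k + 1]
    return [int_to_vocab[int(i)] for i in nearest]

print(nearest_words('new', embed_mat, vocab_to_int, int_to_vocab))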

Visualizing the word vectors

Below we'll use T-SNE to visualize how our high-dimensional word vectors cluster together. T-SNE projects these vectors into two dimensions while preserving local structure. Check out this post from Christopher Olah to learn more about T-SNE and other ways to visualize high-dimensional data.


In [ ]:
%matplotlib inline
%config InlineBackend.figure_format = 'retina'

import matplotlib.pyplot as plt
from sklearn.manifold import TSNE

In [ ]:
viz_words = 500
tsne = TSNE()
# Project the first viz_words word vectors down to two dimensions
embed_tsne = tsne.fit_transform(embed_mat[:viz_words, :])

In [ ]:
fig, ax = plt.subplots(figsize=(14, 14))
for idx in range(viz_words):
    # Plot each word at its 2D T-SNE coordinates and label the point with the word itself
    plt.scatter(*embed_tsne[idx, :], color='steelblue')
    plt.annotate(int_to_vocab[idx], (embed_tsne[idx, 0], embed_tsne[idx, 1]), alpha=0.7)
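
If you want to come back to these embeddings later without restoring the checkpoint again, it's handy to dump the matrices to disk with NumPy. This is just a convenience sketch; the file names here are arbitrary.


In [ ]:
# Save the trained embeddings and their 2D projection for later reuse
np.save('embed_mat.npy', embed_mat)
np.save('embed_tsne.npy', embed_tsne)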