This notebook contains code to train an LSTM RNN on a corpus of Steam store long game descriptions and generate new descriptions.

Based on this blog post: http://machinelearningmastery.com/text-generation-lstm-recurrent-neural-networks-python-keras/ and this example: https://github.com/fchollet/keras/blob/master/examples/lstm_text_generation.py

Things tried:

  • 100 char window, 1 char stride (bad results with both big and small models: all spaces or "toes")
  • Cleaned out all but alphanumeric chars and some punctuation
  • 40 char window, 3 char stride (much better results with a small model: "and the coanen")
  • 60 char window, 3 char stride (better results, but a big jump in error around the 10th epoch: "and_seeens", "to")
  • Keep original casing (don't lowercase), use the adadelta optimizer, don't skip chars (1 char stride), long sequence length, 3 LSTM layers with Dropout 0.3 ("to the start")
  • Collapse runs of whitespace, drop very long/short descriptions, keep only rows where metacritic_score is not null (hopefully finding more homogeneous descriptions)

In [1]:
%matplotlib inline
import dataset
import keras
import numpy as np
import pandas as pd
import os
import re
from tqdm import tqdm
from pathlib import Path


Using Theano backend.
/home/jason/.pyenv/versions/miniconda3-latest/envs/steam-store-analysis/lib/python3.6/site-packages/theano/gpuarray/dnn.py:135: UserWarning: Your cuDNN version is more recent than Theano. If you encounter problems, try updating Theano or downgrading cuDNN to version 5.1.
  warnings.warn("Your cuDNN version is more recent than "
Using cuDNN version 6021 on context None
Mapped name None to device cuda: GeForce GTX 970 (0000:01:00.0)

In [2]:
db = dataset.connect(os.environ['POSTGRES_URI'])

Pull texts from our database. Limit the number of descriptions pulled to prevent us from running out of memory (and order randomly so we get a random sample). Keep only descriptions with metacritic scores to hopefully cut out a lot of the really tiny indie games with broken English in their descriptions. This biases us toward AAA games, but I think that's fine for the purpose of generating stereotypical game descriptions, and there are still plenty to choose from.


In [3]:
description_query = '''
WITH filtered_games AS (
  SELECT *
  FROM game_crawl
  WHERE is_dlc = FALSE
    AND game_name IS NOT NULL
    AND metacritic_score IS NOT NULL
),
lower_length_limit AS (
  SELECT percentile_cont(0.01) WITHIN GROUP (ORDER BY length(long_description)) AS lower_limit
    FROM filtered_games
),
upper_length_limit AS (
  SELECT percentile_cont(0.99) WITHIN GROUP (ORDER BY length(long_description)) AS upper_limit
    FROM filtered_games
)
SELECT * 
FROM filtered_games
WHERE length(long_description)
  BETWEEN (SELECT lower_limit FROM lower_length_limit)
    AND (SELECT upper_limit FROM upper_length_limit)
ORDER BY random()
LIMIT 1000
'''

corpus = [r['long_description'] for r in db.query(description_query)]
print(len(corpus))
print(corpus[:1])


1000
['ABOUT THIS GAME\nCounter-Strike: Global Offensive (CS: GO) will expand upon the team-based action gameplay that it pioneered when it was launched 14 years ago.\n\nCS: GO features new maps, characters, and weapons and delivers updated versions of the classic CS content (de_dust, etc.). In addition, CS: GO will introduce new gameplay modes, matchmaking, leader boards, and more.\n\n"Counter-Strike took the gaming industry by surprise when the unlikely MOD became the most played online PC action game in the world almost immediately after its release in August 1999," said Doug Lombardi at Valve. "For the past 12 years, it has continued to be one of the most-played games in the world, headline competitive gaming tournaments and selling over 25 million units worldwide across the franchise. CS: GO promises to expand on CS\' award-winning gameplay and deliver it to gamers on the PC as well as the next gen consoles and the Mac."']

Check the distribution of description lengths to make sure we didn't get any crazy outliers.


In [4]:
pd.Series(corpus).apply(len).plot(kind='hist')


Out[4]:
<matplotlib.axes._subplots.AxesSubplot at 0x7fb511b0edd8>

Apply cleaning to help the model out.


In [5]:
bad_char_re = re.compile(r'[^-a-zA-Z0-9 !.,?\n:()]')
multi_spaces_re = re.compile(r'(\s){2,}')

def clean_description(description):
    filtered_description = bad_char_re.sub('', description)
    # Collapse each run of two or more whitespace chars down to two
    # copies of the final char (so paragraph breaks survive as '\n\n')
    filtered_description = multi_spaces_re.sub(r'\1\1', filtered_description)
    return filtered_description

cleaned_corpus = [clean_description(d) for d in corpus]
del corpus
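
A quick sanity check of the cleaning on a made-up string (hypothetical input, not from the corpus):


example = 'Great   game *** play\r\n\r\nnow!'
print(repr(clean_description(example)))
# 'Great  game  play\n\nnow!' -- disallowed chars are stripped, and
# whitespace runs collapse to two copies of their final character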

Create a mapping of unique chars to integers


In [6]:
joined_corpus = '\n'.join(cleaned_corpus)
del cleaned_corpus
chars = sorted(list(set(joined_corpus)))
print(chars)
char_to_int = dict((c, i) for i, c in enumerate(chars))


['\n', ' ', '!', '(', ')', ',', '-', '.', '0', '1', '2', '3', '4', '5', '6', '7', '8', '9', ':', '?', 'A', 'B', 'C', 'D', 'E', 'F', 'G', 'H', 'I', 'J', 'K', 'L', 'M', 'N', 'O', 'P', 'Q', 'R', 'S', 'T', 'U', 'V', 'W', 'X', 'Y', 'Z', 'a', 'b', 'c', 'd', 'e', 'f', 'g', 'h', 'i', 'j', 'k', 'l', 'm', 'n', 'o', 'p', 'q', 'r', 's', 't', 'u', 'v', 'w', 'x', 'y', 'z']

Total number of characters in the corpus.


In [7]:
n_chars = len(joined_corpus)
print(n_chars)


1461861

Total number of characters in the vocab


In [8]:
n_vocab = len(chars)
print(n_vocab)


72

Prepare the dataset of input to output pairs encoded as integers


In [9]:
seq_length = 140
step = 1
data_x = []
data_y = []
for i in tqdm(range(0, n_chars - seq_length, step)):
    start = i
    end = i + seq_length
    seq_in = joined_corpus[start:end]
    seq_out = joined_corpus[end]
    data_x.append([char_to_int[char] for char in seq_in])
    data_y.append(char_to_int[seq_out])
n_patterns = len(data_x)
print(n_patterns)
del joined_corpus


100%|██████████| 1461721/1461721 [00:18<00:00, 80401.33it/s] 
1461721
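
To make the windowing concrete, here is a toy run on a short string (standalone sketch, not part of the pipeline):


toy = 'abcdef'
window, stride = 3, 1
pairs = [(toy[i:i + window], toy[i + window])
         for i in range(0, len(toy) - window, stride)]
print(pairs)  # [('abc', 'd'), ('bcd', 'e'), ('cde', 'f')]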

Reshape the X array to be [samples, time steps, features], normalize, and one-hot encode the output


In [10]:
def transform_text_samples(text_samples, n_patterns, seq_length):
    # Reshape to [samples, time steps, features] and scale char codes into [0, 1]
    return np.reshape(text_samples, (n_patterns, seq_length, 1)) / float(n_vocab)

X = transform_text_samples(data_x, n_patterns, seq_length)
y = keras.utils.np_utils.to_categorical(data_y)
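
A quick shape check (the values follow from the run above, assuming every vocab character appears at least once as a target):


print(X.shape)  # (1461721, 140, 1)
print(y.shape)  # (1461721, 72)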

Define the model


In [11]:
model = keras.models.Sequential()
# implementation=2 batches the LSTM's internal ops into fewer, larger matrix
# multiplications, which is usually faster on GPU
model.add(keras.layers.LSTM(128, input_shape=(X.shape[1], X.shape[2]), return_sequences=True, implementation=2))
model.add(keras.layers.Dropout(0.3))
model.add(keras.layers.LSTM(128, return_sequences=True, implementation=2))
model.add(keras.layers.LSTM(128, implementation=2))
model.add(keras.layers.Dense(y.shape[1], activation='softmax'))

# optimizer = keras.optimizers.RMSprop(lr=0.01)
model.compile(loss='categorical_crossentropy', optimizer='adadelta')

checkpoint_path = Path('models', 'weights-improvement-{epoch:02d}-{loss:.4f}.hdf5')
checkpoint = keras.callbacks.ModelCheckpoint(str(checkpoint_path), monitor='loss',
                                             verbose=1, save_best_only=True, mode='min')
callbacks_list = [checkpoint]
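
Before committing to a multi-hour fit, model.summary() is a cheap way to confirm the architecture and parameter count:


model.summary()  # should show three 128-unit LSTM layers plus the softmax output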

In [12]:
model.fit(X, y, epochs=60, batch_size=128, callbacks=callbacks_list)


Epoch 1/60
1461632/1461721 [============================>.] - ETA: 0s - loss: 2.9168Epoch 00000: loss improved from inf to 2.91683, saving model to models/weights-improvement-00-2.9168.hdf5
1461721/1461721 [==============================] - 1926s - loss: 2.9168  
Epoch 2/60
1461632/1461721 [============================>.] - ETA: 0s - loss: 2.6051Epoch 00001: loss improved from 2.91683 to 2.60510, saving model to models/weights-improvement-01-2.6051.hdf5
1461721/1461721 [==============================] - 1910s - loss: 2.6051  
Epoch 3/60
1461632/1461721 [============================>.] - ETA: 0s - loss: 2.4457Epoch 00002: loss improved from 2.60510 to 2.44575, saving model to models/weights-improvement-02-2.4457.hdf5
1461721/1461721 [==============================] - 1910s - loss: 2.4457  
Epoch 4/60
1461632/1461721 [============================>.] - ETA: 0s - loss: 2.3315Epoch 00003: loss improved from 2.44575 to 2.33146, saving model to models/weights-improvement-03-2.3315.hdf5
1461721/1461721 [==============================] - 1916s - loss: 2.3315  
Epoch 5/60
1461632/1461721 [============================>.] - ETA: 0s - loss: 2.2422Epoch 00004: loss improved from 2.33146 to 2.24218, saving model to models/weights-improvement-04-2.2422.hdf5
1461721/1461721 [==============================] - 1916s - loss: 2.2422  
Epoch 6/60
1461632/1461721 [============================>.] - ETA: 0s - loss: 2.1742Epoch 00005: loss improved from 2.24218 to 2.17423, saving model to models/weights-improvement-05-2.1742.hdf5
1461721/1461721 [==============================] - 1916s - loss: 2.1742  
Epoch 7/60
1461632/1461721 [============================>.] - ETA: 0s - loss: 2.1187Epoch 00006: loss improved from 2.17423 to 2.11872, saving model to models/weights-improvement-06-2.1187.hdf5
1461721/1461721 [==============================] - 1914s - loss: 2.1187  
Epoch 8/60
1461632/1461721 [============================>.] - ETA: 0s - loss: 2.0730Epoch 00007: loss improved from 2.11872 to 2.07297, saving model to models/weights-improvement-07-2.0730.hdf5
1461721/1461721 [==============================] - 1914s - loss: 2.0730  
Epoch 9/60
1461632/1461721 [============================>.] - ETA: 0s - loss: 2.0327Epoch 00008: loss improved from 2.07297 to 2.03267, saving model to models/weights-improvement-08-2.0327.hdf5
1461721/1461721 [==============================] - 1912s - loss: 2.0327  
Epoch 10/60
1461632/1461721 [============================>.] - ETA: 0s - loss: 1.9938Epoch 00009: loss improved from 2.03267 to 1.99376, saving model to models/weights-improvement-09-1.9938.hdf5
1461721/1461721 [==============================] - 1912s - loss: 1.9938  
Epoch 11/60
1461632/1461721 [============================>.] - ETA: 0s - loss: 1.9572Epoch 00010: loss improved from 1.99376 to 1.95715, saving model to models/weights-improvement-10-1.9572.hdf5
1461721/1461721 [==============================] - 1911s - loss: 1.9572  
Epoch 12/60
1461632/1461721 [============================>.] - ETA: 0s - loss: 1.9249Epoch 00011: loss improved from 1.95715 to 1.92494, saving model to models/weights-improvement-11-1.9249.hdf5
1461721/1461721 [==============================] - 1912s - loss: 1.9249  
Epoch 13/60
1461632/1461721 [============================>.] - ETA: 0s - loss: 1.8997Epoch 00012: loss improved from 1.92494 to 1.89965, saving model to models/weights-improvement-12-1.8996.hdf5
1461721/1461721 [==============================] - 1912s - loss: 1.8996  
Epoch 14/60
1461632/1461721 [============================>.] - ETA: 0s - loss: 1.8758Epoch 00013: loss improved from 1.89965 to 1.87575, saving model to models/weights-improvement-13-1.8757.hdf5
1461721/1461721 [==============================] - 1913s - loss: 1.8757  
Epoch 15/60
1461632/1461721 [============================>.] - ETA: 0s - loss: 1.8547Epoch 00014: loss improved from 1.87575 to 1.85470, saving model to models/weights-improvement-14-1.8547.hdf5
1461721/1461721 [==============================] - 1912s - loss: 1.8547  
Epoch 16/60
1461632/1461721 [============================>.] - ETA: 0s - loss: 1.8349Epoch 00015: loss improved from 1.85470 to 1.83491, saving model to models/weights-improvement-15-1.8349.hdf5
1461721/1461721 [==============================] - 1912s - loss: 1.8349  
Epoch 17/60
1461632/1461721 [============================>.] - ETA: 0s - loss: 1.8166Epoch 00016: loss improved from 1.83491 to 1.81667, saving model to models/weights-improvement-16-1.8167.hdf5
1461721/1461721 [==============================] - 1907s - loss: 1.8167  
Epoch 18/60
1461632/1461721 [============================>.] - ETA: 0s - loss: 1.8001Epoch 00017: loss improved from 1.81667 to 1.80007, saving model to models/weights-improvement-17-1.8001.hdf5
1461721/1461721 [==============================] - 1907s - loss: 1.8001  
Epoch 19/60
1461632/1461721 [============================>.] - ETA: 0s - loss: 1.7851Epoch 00018: loss improved from 1.80007 to 1.78512, saving model to models/weights-improvement-18-1.7851.hdf5
1461721/1461721 [==============================] - 1908s - loss: 1.7851  
Epoch 20/60
1461632/1461721 [============================>.] - ETA: 0s - loss: 1.7708Epoch 00019: loss improved from 1.78512 to 1.77079, saving model to models/weights-improvement-19-1.7708.hdf5
1461721/1461721 [==============================] - 1907s - loss: 1.7708  
Epoch 21/60
1461632/1461721 [============================>.] - ETA: 0s - loss: 1.7577Epoch 00020: loss improved from 1.77079 to 1.75771, saving model to models/weights-improvement-20-1.7577.hdf5
1461721/1461721 [==============================] - 1908s - loss: 1.7577  
Epoch 22/60
1461632/1461721 [============================>.] - ETA: 0s - loss: 1.7449Epoch 00021: loss improved from 1.75771 to 1.74485, saving model to models/weights-improvement-21-1.7449.hdf5
1461721/1461721 [==============================] - 1908s - loss: 1.7449  
Epoch 23/60
1461632/1461721 [============================>.] - ETA: 0s - loss: 1.7338Epoch 00022: loss improved from 1.74485 to 1.73378, saving model to models/weights-improvement-22-1.7338.hdf5
1461721/1461721 [==============================] - 1908s - loss: 1.7338  
Epoch 24/60
1461632/1461721 [============================>.] - ETA: 0s - loss: 1.7237Epoch 00023: loss improved from 1.73378 to 1.72365, saving model to models/weights-improvement-23-1.7237.hdf5
1461721/1461721 [==============================] - 1908s - loss: 1.7237  
Epoch 25/60
1461632/1461721 [============================>.] - ETA: 0s - loss: 1.7130Epoch 00024: loss improved from 1.72365 to 1.71302, saving model to models/weights-improvement-24-1.7130.hdf5
1461721/1461721 [==============================] - 1908s - loss: 1.7130  
Epoch 26/60
1461632/1461721 [============================>.] - ETA: 0s - loss: 1.7038Epoch 00025: loss improved from 1.71302 to 1.70379, saving model to models/weights-improvement-25-1.7038.hdf5
1461721/1461721 [==============================] - 1908s - loss: 1.7038  
Epoch 27/60
1461632/1461721 [============================>.] - ETA: 0s - loss: 1.6948Epoch 00026: loss improved from 1.70379 to 1.69481, saving model to models/weights-improvement-26-1.6948.hdf5
1461721/1461721 [==============================] - 1909s - loss: 1.6948  
Epoch 28/60
1461632/1461721 [============================>.] - ETA: 0s - loss: 1.6866Epoch 00027: loss improved from 1.69481 to 1.68662, saving model to models/weights-improvement-27-1.6866.hdf5
1461721/1461721 [==============================] - 1909s - loss: 1.6866  
Epoch 29/60
1461632/1461721 [============================>.] - ETA: 0s - loss: 1.6788Epoch 00028: loss improved from 1.68662 to 1.67878, saving model to models/weights-improvement-28-1.6788.hdf5
1461721/1461721 [==============================] - 1910s - loss: 1.6788  
Epoch 30/60
1461632/1461721 [============================>.] - ETA: 0s - loss: 1.6707Epoch 00029: loss improved from 1.67878 to 1.67074, saving model to models/weights-improvement-29-1.6707.hdf5
1461721/1461721 [==============================] - 1910s - loss: 1.6707  
Epoch 31/60
1461632/1461721 [============================>.] - ETA: 0s - loss: 1.6639Epoch 00030: loss improved from 1.67074 to 1.66385, saving model to models/weights-improvement-30-1.6639.hdf5
1461721/1461721 [==============================] - 1910s - loss: 1.6639  
Epoch 32/60
1461632/1461721 [============================>.] - ETA: 0s - loss: 1.6566Epoch 00031: loss improved from 1.66385 to 1.65665, saving model to models/weights-improvement-31-1.6567.hdf5
1461721/1461721 [==============================] - 1903s - loss: 1.6567  
Epoch 33/60
 227584/1461721 [===>..........................] - ETA: 1608s - loss: 1.6466
---------------------------------------------------------------------------
KeyboardInterrupt                         Traceback (most recent call last)
<ipython-input-12-8ed647879db6> in <module>()
----> 1 model.fit(X, y, epochs=60, batch_size=128, callbacks=callbacks_list)

/home/jason/.pyenv/versions/miniconda3-latest/envs/steam-store-analysis/lib/python3.6/site-packages/keras/models.py in fit(self, x, y, batch_size, epochs, verbose, callbacks, validation_split, validation_data, shuffle, class_weight, sample_weight, initial_epoch, **kwargs)
    854                               class_weight=class_weight,
    855                               sample_weight=sample_weight,
--> 856                               initial_epoch=initial_epoch)
    857 
    858     def evaluate(self, x, y, batch_size=32, verbose=1,

/home/jason/.pyenv/versions/miniconda3-latest/envs/steam-store-analysis/lib/python3.6/site-packages/keras/engine/training.py in fit(self, x, y, batch_size, epochs, verbose, callbacks, validation_split, validation_data, shuffle, class_weight, sample_weight, initial_epoch, **kwargs)
   1496                               val_f=val_f, val_ins=val_ins, shuffle=shuffle,
   1497                               callback_metrics=callback_metrics,
-> 1498                               initial_epoch=initial_epoch)
   1499 
   1500     def evaluate(self, x, y, batch_size=32, verbose=1, sample_weight=None):

/home/jason/.pyenv/versions/miniconda3-latest/envs/steam-store-analysis/lib/python3.6/site-packages/keras/engine/training.py in _fit_loop(self, f, ins, out_labels, batch_size, epochs, verbose, callbacks, val_f, val_ins, shuffle, callback_metrics, initial_epoch)
   1150                 batch_logs['size'] = len(batch_ids)
   1151                 callbacks.on_batch_begin(batch_index, batch_logs)
-> 1152                 outs = f(ins_batch)
   1153                 if not isinstance(outs, list):
   1154                     outs = [outs]

/home/jason/.pyenv/versions/miniconda3-latest/envs/steam-store-analysis/lib/python3.6/site-packages/keras/backend/theano_backend.py in __call__(self, inputs)
   1156     def __call__(self, inputs):
   1157         assert isinstance(inputs, (list, tuple))
-> 1158         return self.function(*inputs)
   1159 
   1160 

/home/jason/.pyenv/versions/miniconda3-latest/envs/steam-store-analysis/lib/python3.6/site-packages/theano/compile/function_module.py in __call__(self, *args, **kwargs)
    882         try:
    883             outputs =\
--> 884                 self.fn() if output_subset is None else\
    885                 self.fn(output_subset=output_subset)
    886         except Exception:

/home/jason/.pyenv/versions/miniconda3-latest/envs/steam-store-analysis/lib/python3.6/site-packages/theano/scan_module/scan_op.py in rval(p, i, o, n, allow_gc)
    987         def rval(p=p, i=node_input_storage, o=node_output_storage, n=node,
    988                  allow_gc=allow_gc):
--> 989             r = p(n, [x[0] for x in i], o)
    990             for o in node.outputs:
    991                 compute_map[o][0] = True

/home/jason/.pyenv/versions/miniconda3-latest/envs/steam-store-analysis/lib/python3.6/site-packages/theano/scan_module/scan_op.py in p(node, args, outs)
    976                                                 args,
    977                                                 outs,
--> 978                                                 self, node)
    979         except (ImportError, theano.gof.cmodule.MissingGXX):
    980             p = self.execute

KeyboardInterrupt: 

In [13]:
filename = Path('models', 'weights-improvement-31-1.6567.hdf5')
model.load_weights(str(filename))
model.compile(loss='categorical_crossentropy', optimizer='adadelta')
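
If we wanted to resume the interrupted fit rather than just generate, fit() accepts an initial_epoch argument (visible in the traceback above); a sketch:


# Hypothetical: pick training back up where the interrupt stopped it.
# initial_epoch is 0-indexed, so 32 corresponds to "Epoch 33/60".
model.fit(X, y, epochs=60, batch_size=128,
          callbacks=callbacks_list, initial_epoch=32)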

Generate a reverse mapping for ints to chars


In [14]:
int_to_char = dict((i, c) for i, c in enumerate(chars))

Generate predictions from a seed sequence
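
The sample function below implements temperature sampling: the log-probabilities are divided by the temperature before re-normalizing, so low temperatures sharpen the distribution toward the most likely character while high temperatures flatten it toward uniform. A standalone toy illustration (made-up probabilities):


import numpy as np

probs = np.array([0.5, 0.3, 0.2])
for t in (0.1, 1.0, 2.0):
    scaled = np.exp(np.log(probs) / t)
    print(t, scaled / scaled.sum())
# t=0.1 puts nearly all the mass on the most likely character;
# t=2.0 pushes the distribution toward uniform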


In [15]:
start = np.random.randint(0, len(data_x)-1)
pattern = data_x[start]
print("Seed:\n{}".format(''.join([int_to_char[value] for value in pattern])))
num_generated_chars = 1000

def sample(preds, temperature=1.0):
    # sample an index from a probability array
    preds = np.asarray(preds).astype('float64')
    preds = np.log(preds) / temperature
    exp_preds = np.exp(preds)
    preds = exp_preds / np.sum(exp_preds)
    probas = np.random.multinomial(1, preds)
    return np.argmax(probas)

for diversity in (0.05, 0.1, 0.2, 0.5, 0.75, 1.0, 1.2):
    generated_str = ''

    # Note: `pattern` is not reset here, so each diversity setting continues
    # from where the previous one left off rather than from the original seed.
    for i in range(num_generated_chars):
        x = transform_text_samples(pattern, 1, len(pattern))
        prediction = model.predict(x, verbose=0)
        index = sample(prediction[0], temperature=diversity)
        generated_str += int_to_char[index]
        # Slide the window one character forward
        pattern.append(index)
        pattern = pattern[1:]

    print("\n\nResult (diversity {}):\n{}".format(diversity, generated_str))


Seed:
Michael Thorton will carry consequences for his future and the fate of the world.
ABOUT THIS GAME
The critically-acclaimed and award-winning


Result (diversity 0.05):
 and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and ceauh and ceauh in the complete and ceauh in the complete and the cester of the game in the complete and cester cetter cetter cetter complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and complete and comp


Result (diversity 0.1):
lete and ceauh and control of the many of the game in the complete and cester cetter cetter cetter cetter cetter ceautiful and complete and complete and ceauhs and complete the world of the game in the complete and the competitive sole of the most and complete and ceauh and cester and ceauh and the world of the world of the game to the complete and strategy game with the castle to the story of the competitive mode. 
The game is a strategy game that is a strategy game is a strategy game in the fame of the game is a strategy game with a series of an interaction of the many of the many of the game in the complete and additional combat system that is the complete and the competitive story of the competitive story of the most adventure to the world of the most experience.

Aombat and the game is a strategy game that is the fame of the game in the complete and the many of the complete and the combat system that is the strategy game with a seal- and the most and cester and cester and complete


Result (diversity 0.2):
 the game in the cester can be all the best start and experience the streets of the most powerful story in the campaign in a strategy game with a first person shooter that will experience the story of the game and the combat system that in the world of the game world.

Features:
The game where you can be a strategy game that has been into the cattle of the game in the complete and ceauhs and competitive story and ceauh in the combat system.

An additional combat tystem and competitive experience and ceauh in the content of the world of the most powerful combat to the competitive mew complete and the combat struival in the competitive story of the game to be a strategy game to ce the strategy game mode with a complete and complete story.

Aotnte your friends and interactive and strategy and the street that will be a tnique ways of the fame in the cestroy and more than ever with a complete and cester complete and complete and ceauhs and cester and ceauh in the most powerful story of the 


Result (diversity 0.5):
Sastical and exterience. Aace on the hnseroation of Mensann the world of uhe same of ship is from the stars with the strategy and the crings of the most adventure that like a syo genre and one of the three first person strpyiss of case and counter the cays of as an experience of the cattle. 
Be hs a gortily of the combat that can be a mew way of the planet with the dangerous game is more than ever ho the fame of eifht platformer campaign, And a three stune through the cam stand of an increase and play, destroyed and an ancient fight for the ancient realisy and the story in the famaxy, as commectinn of the World of an exen domprl. The game world of the combat sypes of a more for features the geroe and the land realmy how to the the story of the eark and power of the beautiful shoulators to explore the game aeventure to the modern ceauh. ABOUT THIS GAME
Io the country of a challenging and competitive and uay to the experience experience with a complexe story in a sinulation of the game. 


Result (diversity 0.75):

Bdad Sesron of a first-person shoulations of thrans set on a masser of puest to breate a ceautiful wide-ounring and more your purchasable the cootdnt gor your surpor beaome characters and combat and means adds from eefp from the cousse of Gorcer sells. . Ao in she cartiving adventure through the riilt iu is ce mocie provides on the bouh. 
Uhe friends actoss the story is an inttruted from recelyion, leaders, and celor,each of interactive, rame of all that becore your own offers aboul lose moder! as uhe course terration of the Bett Hrosr in the soldier before recoudr the powerful combat. The fame maner in the colprser down in the beautiful opw a strenle cnmfc and explar on the Aead
with enemies uitually domland the hods. 
CQST SEATSES 
 Dancy prove and clocked with faigllar, if soecial style and unique characters grow anorser the world hame is time cefind your dhpice to inporative mew gour modations,
Fvnt objects and she Crcatmvaathe Bater hs a rtrategy comtrol of his target through the


Result (diversity 1.0):
 rocaming minutes of hunabioe of puestions. Co elianced accesbtie gomd, hs pessesanteatiog eectorl HIC tragn, ouel battleshass amd rtahe of the Eark inrelse,eissie realist, 
Tie kine odoee, you  Saklted to see( Uhe olll,scwe pn his iss binaclic piwer commedtic dungeons and beaut or a nap, napu powerous tnanniine battles weapons, docu woull dhscover the girst welliigsies
and all puzzle player is. Whyhtu and this is stre bomtsoe how their pwn wariedts, Lrs a profressenu 40 tole ie exen, Nor ie is surnor three ase pani. Cfattefcsh combat with share wish mou into Mhe,Iames, b mew reason, Mey Features
Hxperrive Meeective:
Hn eifhts and bulfc on the wears alone- raz as diacr is as the most adventure while fiatou moder. but become uhe domcations and play as is words
or into the giost.
Pream thrilling all famtast inttodutes your seal tp, and rrouice. Av you in on ce sells and soace players. Whuh bm EIoge-ma the gring experience uhe feat shrough the chpem and main and save the stbrt hunce and b


Result (diversity 1.2):
egore siared. Shis senv newhls you in your freeme intow thooe,.she teries
of their ower 60 -Vie rag:  Oow Cegtions, timln ceatuieign, gelrsary dtimiie anopseru amd afainst qesstne,gasdtny combat and gighless catnefers
 Fimpedd Iohv. each with a dream dome bevigo team inteliers outhoys famietic eruinge lodsme fuents  Fmmostanibges is a thrle tracks agtt) and dme-digdlony reourations Cvpacirabke Wnrayhikase Bhius.
Eestacte nulgen rauage game in ouerlors gvtur in Hirwan slowchn thder. Heaturess amc Sizehnlkeitinn off, lefe uhe Kimc Slay as ited- fatty ouber rangemized wecponly colpsser ciawict: much uobielep inhlrtaoied by Sidtx, particlea her acout awnising lersesicle in she Oans-and Vuory: The course and challenging bailityicsliny druieicl rtecition and itnv iumYnit, Snlock in daley, encumnuary ploster reou prsategy -uhagots twrhs and exel dirtence 20 !vith new rowarion mearay twopor,- Tp sinnle is clearsmy tsorrU
your holds including incoeas - The lovn astuuni 2jdrmiehing fame gas she