Face Generation

In this project, you'll use generative adversarial networks to generate new images of faces.

Get the Data

You'll be using two datasets in this project:

  • MNIST
  • CelebA

Since the CelebA dataset is complex and this is your first project working with GANs, we want you to test your neural network on MNIST before CelebA. Running the GAN on MNIST will let you see how well your model trains sooner.

If you're using FloydHub, set data_dir to "/input" and use the FloydHub data ID "R5KrjnANiKVhLWAkpXhNBe".


In [9]:
data_dir = './data'

# FloydHub - Use with data ID "R5KrjnANiKVhLWAkpXhNBe"
#data_dir = '/input'


"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
import helper

helper.download_extract('mnist', data_dir)
helper.download_extract('celeba', data_dir)


Found mnist Data
Found celeba Data

Explore the Data

MNIST

As you're aware, the MNIST dataset contains images of handwritten digits. You can change the number of examples shown by changing show_n_images.


In [10]:
show_n_images = 25

"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
%matplotlib inline
import os
from glob import glob
from matplotlib import pyplot

mnist_images = helper.get_batch(glob(os.path.join(data_dir, 'mnist/*.jpg'))[:show_n_images], 28, 28, 'L')
pyplot.imshow(helper.images_square_grid(mnist_images, 'L'), cmap='gray')


Out[10]:
<matplotlib.image.AxesImage at 0x7f87b576bc50>

CelebA

The CelebFaces Attributes Dataset (CelebA) contains over 200,000 celebrity images with annotations. Since you're going to be generating faces, you won't need the annotations. You can change the number of examples shown by changing show_n_images.


In [11]:
show_n_images = 25

"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
mnist_images = helper.get_batch(glob(os.path.join(data_dir, 'img_align_celeba/*.jpg'))[:show_n_images], 28, 28, 'RGB')
pyplot.imshow(helper.images_square_grid(mnist_images, 'RGB'))


Out[11]:
<matplotlib.image.AxesImage at 0x7f8810418518>

Preprocess the Data

Since the project's main focus is on building the GAN, we'll preprocess the data for you. The MNIST and CelebA images will be scaled to values in the range of -0.5 to 0.5 and resized to 28x28. The CelebA images will be cropped to remove parts of the image that don't include a face, then resized down to 28x28.

The MNIST images are black-and-white images with a single color channel, while the CelebA images have 3 color channels (RGB).
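
The generator built below ends in a tanh, whose outputs lie in the range -1 to 1, so the real images need to be brought into the same range before they reach the discriminator. A minimal sketch of that rescaling, assuming a NumPy batch in [-0.5, 0.5] as produced by the helper functions (the shape and values here are made up for illustration):

import numpy as np

def rescale_for_tanh(batch):
    # Map images from [-0.5, 0.5] to [-1, 1] so they match the generator's tanh output range
    return batch * 2.0

# Illustrative batch of 16 grayscale 28x28 images in [-0.5, 0.5]
batch = np.random.uniform(-0.5, 0.5, size=(16, 28, 28, 1)).astype(np.float32)
rescaled = rescale_for_tanh(batch)
print(rescaled.min(), rescaled.max())  # approximately -1.0 and 1.0

This is the same rescaling the train function below applies with batch_images * 2.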

Build the Neural Network

You'll build the components necessary for a GAN by implementing the following functions:

  • model_inputs
  • discriminator
  • generator
  • model_loss
  • model_opt
  • train

Check the Version of TensorFlow and Access to GPU

This will check to make sure you have the correct version of TensorFlow and access to a GPU.


In [12]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
from distutils.version import LooseVersion
import warnings
import tensorflow as tf

# Check TensorFlow Version
assert LooseVersion(tf.__version__) >= LooseVersion('1.0'), 'Please use TensorFlow version 1.0 or newer.  You are using {}'.format(tf.__version__)
print('TensorFlow Version: {}'.format(tf.__version__))

# Check for a GPU
if not tf.test.gpu_device_name():
    warnings.warn('No GPU found. Please use a GPU to train your neural network.')
else:
    print('Default GPU Device: {}'.format(tf.test.gpu_device_name()))


TensorFlow Version: 1.1.0
Default GPU Device: /gpu:0

Input

Implement the model_inputs function to create TF Placeholders for the Neural Network. It should create the following placeholders:

  • Real input images placeholder with rank 4 using image_width, image_height, and image_channels.
  • Z input placeholder with rank 2 using z_dim.
  • Learning rate placeholder with rank 0.

Return the placeholders in the following tuple: (tensor of real input images, tensor of z data, learning rate).


In [13]:
import problem_unittests as tests

def model_inputs(image_width, image_height, image_channels, z_dim):
    """
    Create the model inputs
    :param image_width: The input image width
    :param image_height: The input image height
    :param image_channels: The number of image channels
    :param z_dim: The dimension of Z
    :return: Tuple of (tensor of real input images, tensor of z data, learning rate)
    """
    # real input images placeholder
    inputs_real = tf.placeholder(tf.float32, (None, image_width, image_height, image_channels), name='input_real')
    # noise placeholder
    inputs_z = tf.placeholder(tf.float32, (None, z_dim), name='input_z')
    learning_rate = tf.placeholder(tf.float32, None, name='learn_rate')

    return (inputs_real, inputs_z, learning_rate)

"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_model_inputs(model_inputs)


Tests Passed

Discriminator

Implement discriminator to create a discriminator neural network that discriminates on images. This function should be able to reuse the variables in the neural network. Use tf.variable_scope with a scope name of "discriminator" to allow the variables to be reused. The function should return a tuple of (tensor output of the discriminator, tensor logits of the discriminator).


In [14]:
def discriminator(images, reuse=False, alpha=0.2):
    """
    Create the discriminator network
    :param images: Tensor of input image(s)
    :param reuse: Boolean if the weights should be reused
    :return: Tuple of (tensor output of the discriminator, tensor logits of the discriminator)
    """
    # TODO: Implement Function
    with tf.variable_scope('discriminator', reuse=reuse):
        # Input layer is 28x28xC (1 channel for MNIST, 3 for CelebA)
        #print('disc.images.shape:', images.shape)
        image_layer1 = tf.layers.conv2d(images, 64, 5, strides=2, padding='same')
        relu1 = tf.maximum(alpha * image_layer1, image_layer1)
        #print('disc.image_layer1.shape:', relu1.shape)
        # 14x14x64

        image_layer2 = tf.layers.conv2d(relu1, 128, 5, strides=2, padding='same')
        bn2 = tf.layers.batch_normalization(image_layer2, training=True)
        relu2 = tf.maximum(alpha * bn2, bn2)
        #print('disc.image_layer2.shape:', relu2.shape)
        # 7x7x128

        image_layer3 = tf.layers.conv2d(relu2, 256, 5, strides=2, padding='same')
        bn3 = tf.layers.batch_normalization(image_layer3, training=True)
        relu3 = tf.maximum(alpha * bn3, bn3)
        #print('disc.image_layer3.shape:', relu3.shape)
        # 4x4x256

        # Flatten it
        flat = tf.reshape(relu3, (-1, 4*4*256))
        logits = tf.layers.dense(flat, 1)
        out = tf.sigmoid(logits)
        #print('disc.logits.shape:', logits.shape)
        #print('disc.out.shape:', out.shape)

    return out, logits


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_discriminator(discriminator, tf)


Tests Passed

Generator

Implement generator to generate an image using z. This function should be able to reuse the variables in the neural network. Use tf.variable_scope with a scope name of "generator" to allow the variables to be reused. The function should return the generated 28 x 28 x out_channel_dim images.
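
As a quick sanity check on the upsampling path used in the implementation below (a sketch of the arithmetic only): with 'same' padding, a transposed convolution scales the spatial size by its stride, so the 7x7 starting layer reaches 28x28 after strides of 2, 1, and 2.

# With padding='same', conv2d_transpose output size = input size * stride
assert 7 * 2 == 14    # x1 (7x7)   -> x2 (14x14), stride 2
assert 14 * 1 == 14   # x2 (14x14) -> x3 (14x14), stride 1
assert 14 * 2 == 28   # x3 (14x14) -> logits (28x28), stride 2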


In [15]:
def generator(z, out_channel_dim, is_train=True, alpha=0.2):
    """
    Create the generator network
    :param z: Input z
    :param out_channel_dim: The number of channels in the output image
    :param is_train: Boolean if generator is being used for training
    :return: The tensor output of the generator
    """
    reuse = not is_train
    with tf.variable_scope('generator', reuse=reuse):
        #print('generator.z.shape:', z.shape)

        # First fully connected layer
        x1 = tf.layers.dense(z, 7*7*512)
        # Reshape it to start the convolutional stack
        x1 = tf.reshape(x1, (-1, 7, 7, 512))
        x1 = tf.layers.batch_normalization(x1, training=is_train)
        x1 = tf.maximum(alpha * x1, x1)
        #print('gen.x1.shape:', x1.shape)

        x2 = tf.layers.conv2d_transpose(x1, 256, 5, strides=2, padding='same')
        x2 = tf.layers.batch_normalization(x2, training=is_train)
        x2 = tf.maximum(alpha * x2, x2)
        #print('gen.x2.shape:', x2.shape)

        x3 = tf.layers.conv2d_transpose(x2, 128, 5, strides=1, padding='same')
        x3 = tf.layers.batch_normalization(x3, training=is_train)
        x3 = tf.maximum(alpha * x3, x3)
        #print('gen.x3.shape:', x3.shape)

        # Output layer
        logits = tf.layers.conv2d_transpose(x3, out_channel_dim, 5, strides=2, padding='same')
        #logits = tf.image.resize_images(logits, (28, 28))
        #print('gen.logits.shape:', logits.shape)
        # 28x28xout_channel_dim

        out = tf.tanh(logits)

        return out

"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_generator(generator, tf)


Tests Passed

Loss

Implement model_loss to build the GAN for training and calculate the loss. The function should return a tuple of (discriminator loss, generator loss). Use the following functions you implemented:

  • discriminator(images, reuse=False)
  • generator(z, out_channel_dim, is_train=True)
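
For reference, since the discriminator output is out = sigmoid(logits), the three sigmoid cross-entropy terms in the implementation below correspond to the standard (non-saturating) GAN objective:

$$\mathcal{L}_D = -\mathbb{E}_{x}\big[\log D(x)\big] - \mathbb{E}_{z}\big[\log\big(1 - D(G(z))\big)\big]$$

$$\mathcal{L}_G = -\mathbb{E}_{z}\big[\log D(G(z))\big]$$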

In [16]:
def model_loss(input_real, input_z, out_channel_dim):
    """
    Get the loss for the discriminator and generator
    :param input_real: Images from the real dataset
    :param input_z: Z input (i.e., noise)
    :param out_channel_dim: The number of channels in the output image
    :return: A tuple of (discriminator loss, generator loss)
    """
    g_model = generator(z=input_z, out_channel_dim=out_channel_dim)
    d_model_real, d_logits_real = discriminator(images=input_real, reuse=False)
    d_model_fake, d_logits_fake = discriminator(images=g_model, reuse=True)

    d_loss_real = tf.reduce_mean(
        tf.nn.sigmoid_cross_entropy_with_logits(logits=d_logits_real, labels=tf.ones_like(d_model_real)))
    d_loss_fake = tf.reduce_mean(
        tf.nn.sigmoid_cross_entropy_with_logits(logits=d_logits_fake, labels=tf.zeros_like(d_model_fake)))
    g_loss = tf.reduce_mean(
        tf.nn.sigmoid_cross_entropy_with_logits(logits=d_logits_fake, labels=tf.ones_like(d_model_fake)))

    d_loss = d_loss_real + d_loss_fake

    return d_loss, g_loss    


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_model_loss(model_loss)


Tests Passed

Optimization

Implement model_opt to create the optimization operations for the GAN. Use tf.trainable_variables to get all the trainable variables. Filter the variables with names that are in the discriminator and generator scope names. The function should return a tuple of (discriminator training operation, generator training operation).


In [17]:
def model_opt(d_loss, g_loss, learning_rate, beta1):
    """
    Get optimization operations
    :param d_loss: Discriminator loss Tensor
    :param g_loss: Generator loss Tensor
    :param learning_rate: Learning Rate Placeholder
    :param beta1: The exponential decay rate for the 1st moment in the optimizer
    :return: A tuple of (discriminator training optimizer, generator training optimizer)
    """
    # Get weights and bias to update
    t_vars = tf.trainable_variables()
    d_vars = [var for var in t_vars if var.name.startswith('discriminator')]
    g_vars = [var for var in t_vars if var.name.startswith('generator')]

    # Optimize (run the batch normalization update ops before each training step)
    with tf.control_dependencies(tf.get_collection(tf.GraphKeys.UPDATE_OPS)):
        d_train_opt = tf.train.AdamOptimizer(learning_rate, beta1=beta1).minimize(d_loss, var_list=d_vars)
        g_train_opt = tf.train.AdamOptimizer(learning_rate, beta1=beta1).minimize(g_loss, var_list=g_vars)

    return d_train_opt, g_train_opt

"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_model_opt(model_opt, tf)


Tests Passed

Neural Network Training

Show Output

Use this function to show the current output of the generator during training. It will help you determine how well the GAN is training.


In [18]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
import numpy as np

def show_generator_output(sess, n_images, input_z, out_channel_dim, image_mode):
    """
    Show example output for the generator
    :param sess: TensorFlow session
    :param n_images: Number of Images to display
    :param input_z: Input Z Tensor
    :param out_channel_dim: The number of channels in the output image
    :param image_mode: The mode to use for images ("RGB" or "L")
    """
    cmap = None if image_mode == 'RGB' else 'gray'
    z_dim = input_z.get_shape().as_list()[-1]
    example_z = np.random.uniform(-1, 1, size=[n_images, z_dim])

    samples = sess.run(
        generator(input_z, out_channel_dim, False),
        feed_dict={input_z: example_z})

    images_grid = helper.images_square_grid(samples, image_mode)
    pyplot.imshow(images_grid, cmap=cmap)
    pyplot.show()

Train

Implement train to build and train the GAN. Use the following functions you implemented:

  • model_inputs(image_width, image_height, image_channels, z_dim)
  • model_loss(input_real, input_z, out_channel_dim)
  • model_opt(d_loss, g_loss, learning_rate, beta1)

Use show_generator_output to show the generator's output while you train. Running show_generator_output for every batch would drastically increase training time and the size of the notebook, so it's recommended to show the generator output every 100 batches.


In [19]:
class GAN:
    def __init__(self, data_shape, z_dim, learning_rate, beta1=0.5):

        height = data_shape[1]
        width = data_shape[2]
        channels = data_shape[3]
        print('height: {}, width: {}, channels: {}, z_dim: {}'.format(height, width, channels, z_dim))
        self.input_real, self.input_z, self.learning_rate = \
            model_inputs(width, height, channels, z_dim)

        self.d_loss, self.g_loss = model_loss(self.input_real, self.input_z, channels)

        self.d_opt, self.g_opt = model_opt(self.d_loss, self.g_loss, self.learning_rate, beta1)


def train(epoch_count, batch_size, z_dim, learning_rate, beta1, get_batches, data_shape, data_image_mode):
    """
    Train the GAN
    :param epoch_count: Number of epochs
    :param batch_size: Batch Size
    :param z_dim: Z dimension
    :param learning_rate: Learning Rate
    :param beta1: The exponential decay rate for the 1st moment in the optimizer
    :param get_batches: Function to get batches
    :param data_shape: Shape of the data
    :param data_image_mode: The image mode to use for images ("RGB" or "L")
    """
    print('data shape:', data_shape)
    net = GAN(data_shape, z_dim, learning_rate)
    
    saver = tf.train.Saver()

    samples, losses = [], []
    steps = 0
    print_every = 25
    show_every = 100

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for epoch_i in range(epoch_count):
            for batch_images in get_batches(batch_size):
                # Rescale from [-0.5, 0.5] to [-1, 1] to match the generator's tanh output
                batch_images = batch_images * 2

                steps += 1

                # Sample random noise for G
                batch_z = np.random.uniform(-1, 1, size=(batch_size, z_dim))

                # Run optimizers
                _ = sess.run(net.d_opt, feed_dict={
                        net.input_real: batch_images, net.input_z: batch_z, net.learning_rate: learning_rate})
                # Run the generator optimizer twice for every discriminator update
                _ = sess.run(net.g_opt, feed_dict={
                        net.input_real: batch_images, net.input_z: batch_z, net.learning_rate: learning_rate})
                _ = sess.run(net.g_opt, feed_dict={
                        net.input_real: batch_images, net.input_z: batch_z, net.learning_rate: learning_rate})

                if steps % print_every == 0:
                    # Every print_every steps, get the losses and print them out
                    train_loss_d = net.d_loss.eval({
                            net.input_z: batch_z, net.input_real: batch_images, net.learning_rate: learning_rate})
                    train_loss_g = net.g_loss.eval({
                            net.input_z: batch_z, net.input_real: batch_images, net.learning_rate: learning_rate})

                    print("Epoch {}/{}...".format(epoch_i+1, epochs),
                          "Discriminator Loss: {:.4f}...".format(train_loss_d),
                          "Generator Loss: {:.4f}".format(train_loss_g))
                    # Save losses to view after training
                    losses.append((train_loss_d, train_loss_g))

                if steps % show_every == 0:
                    show_generator_output(sess, 10, net.input_z, data_shape[3], data_image_mode)

        saver.save(sess, './checkpoints/generator.ckpt')

    #with open('samples.pkl', 'wb') as f:
    #    pkl.dump(samples, f)
    
    return losses, samples

MNIST

Test your GAN architecture on MNIST. After 2 epochs, the GAN should be able to generate images that look like handwritten digits. Make sure the loss of the generator is lower than the loss of the discriminator or close to 0.


In [15]:
!mkdir checkpoints

batch_size = 128
z_dim = 100
learning_rate = 0.001
beta1 = 0.5


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
epochs = 2

mnist_dataset = helper.Dataset('mnist', glob(os.path.join(data_dir, 'mnist/*.jpg')))
tf.reset_default_graph()

with tf.Graph().as_default():
    train(epochs, batch_size, z_dim, learning_rate, beta1, mnist_dataset.get_batches,
          mnist_dataset.shape, mnist_dataset.image_mode)


mkdir: cannot create directory ‘checkpoints’: File exists
data shape: (60000, 28, 28, 1)
height: 28, width: 28, channels: 1, z_dim: 100
generator.z.shape: (?, 100)
gen.x1.shape: (?, 4, 4, 512)
gen.x2.shape: (?, 8, 8, 256)
gen.x3.shape: (?, 16, 16, 128)
gen.logits.shape: (?, 28, 28, 1)
disc.images.shape: (?, 28, 28, 1)
disc.image_layer1.shape: (?, 14, 14, 64)
disc.image_layer2.shape: (?, 7, 7, 128)
disc.image_layer3.shape: (?, 4, 4, 256)
disc.logits.shape: (?, 1)
disc.images.shape: (?, 28, 28, 1)
disc.image_layer1.shape: (?, 14, 14, 64)
disc.image_layer2.shape: (?, 7, 7, 128)
disc.image_layer3.shape: (?, 4, 4, 256)
disc.logits.shape: (?, 1)
Epoch 1/2... Discriminator Loss: 0.3953... Generator Loss: 2.9809
Epoch 1/2... Discriminator Loss: 2.3139... Generator Loss: 0.9728
Epoch 1/2... Discriminator Loss: 0.6372... Generator Loss: 1.5321
Epoch 1/2... Discriminator Loss: 0.7290... Generator Loss: 1.0158
generator.z.shape: (?, 100)
gen.x1.shape: (?, 4, 4, 512)
gen.x2.shape: (?, 8, 8, 256)
gen.x3.shape: (?, 16, 16, 128)
gen.logits.shape: (?, 28, 28, 1)
Epoch 1/2... Discriminator Loss: 1.0462... Generator Loss: 4.3096
Epoch 1/2... Discriminator Loss: 0.5284... Generator Loss: 2.6896
Epoch 1/2... Discriminator Loss: 0.5713... Generator Loss: 2.4373
Epoch 1/2... Discriminator Loss: 0.3898... Generator Loss: 1.8314
generator.z.shape: (?, 100)
gen.x1.shape: (?, 4, 4, 512)
gen.x2.shape: (?, 8, 8, 256)
gen.x3.shape: (?, 16, 16, 128)
gen.logits.shape: (?, 28, 28, 1)
Epoch 1/2... Discriminator Loss: 0.9913... Generator Loss: 1.4820
Epoch 1/2... Discriminator Loss: 0.5365... Generator Loss: 1.4952
Epoch 1/2... Discriminator Loss: 1.0379... Generator Loss: 0.8425
Epoch 1/2... Discriminator Loss: 1.0831... Generator Loss: 1.0821
generator.z.shape: (?, 100)
gen.x1.shape: (?, 4, 4, 512)
gen.x2.shape: (?, 8, 8, 256)
gen.x3.shape: (?, 16, 16, 128)
gen.logits.shape: (?, 28, 28, 1)
Epoch 1/2... Discriminator Loss: 1.4507... Generator Loss: 0.4491
Epoch 1/2... Discriminator Loss: 1.7278... Generator Loss: 0.3824
Epoch 1/2... Discriminator Loss: 1.6767... Generator Loss: 0.3356
Epoch 1/2... Discriminator Loss: 1.0764... Generator Loss: 0.6955
generator.z.shape: (?, 100)
gen.x1.shape: (?, 4, 4, 512)
gen.x2.shape: (?, 8, 8, 256)
gen.x3.shape: (?, 16, 16, 128)
gen.logits.shape: (?, 28, 28, 1)
Epoch 1/2... Discriminator Loss: 1.8303... Generator Loss: 0.3051
Epoch 1/2... Discriminator Loss: 0.8113... Generator Loss: 1.0284
Epoch 2/2... Discriminator Loss: 0.9039... Generator Loss: 0.9488
Epoch 2/2... Discriminator Loss: 1.2970... Generator Loss: 0.8633
generator.z.shape: (?, 100)
gen.x1.shape: (?, 4, 4, 512)
gen.x2.shape: (?, 8, 8, 256)
gen.x3.shape: (?, 16, 16, 128)
gen.logits.shape: (?, 28, 28, 1)
Epoch 2/2... Discriminator Loss: 0.7627... Generator Loss: 1.9209
Epoch 2/2... Discriminator Loss: 1.2090... Generator Loss: 0.5573
Epoch 2/2... Discriminator Loss: 1.6164... Generator Loss: 0.3094
Epoch 2/2... Discriminator Loss: 1.6698... Generator Loss: 0.3295
generator.z.shape: (?, 100)
gen.x1.shape: (?, 4, 4, 512)
gen.x2.shape: (?, 8, 8, 256)
gen.x3.shape: (?, 16, 16, 128)
gen.logits.shape: (?, 28, 28, 1)
Epoch 2/2... Discriminator Loss: 0.9085... Generator Loss: 1.3172
Epoch 2/2... Discriminator Loss: 0.8944... Generator Loss: 0.8483
Epoch 2/2... Discriminator Loss: 0.9245... Generator Loss: 1.1797
Epoch 2/2... Discriminator Loss: 0.9288... Generator Loss: 1.1515
generator.z.shape: (?, 100)
gen.x1.shape: (?, 4, 4, 512)
gen.x2.shape: (?, 8, 8, 256)
gen.x3.shape: (?, 16, 16, 128)
gen.logits.shape: (?, 28, 28, 1)
Epoch 2/2... Discriminator Loss: 0.8987... Generator Loss: 0.9910
Epoch 2/2... Discriminator Loss: 1.2408... Generator Loss: 0.4805
Epoch 2/2... Discriminator Loss: 1.0696... Generator Loss: 1.1688
Epoch 2/2... Discriminator Loss: 0.8408... Generator Loss: 1.5538
generator.z.shape: (?, 100)
gen.x1.shape: (?, 4, 4, 512)
gen.x2.shape: (?, 8, 8, 256)
gen.x3.shape: (?, 16, 16, 128)
gen.logits.shape: (?, 28, 28, 1)
Epoch 2/2... Discriminator Loss: 0.9605... Generator Loss: 0.7891
Epoch 2/2... Discriminator Loss: 1.0552... Generator Loss: 1.1932
Epoch 2/2... Discriminator Loss: 0.9252... Generator Loss: 1.0271
Epoch 2/2... Discriminator Loss: 0.8682... Generator Loss: 0.8822
generator.z.shape: (?, 100)
gen.x1.shape: (?, 4, 4, 512)
gen.x2.shape: (?, 8, 8, 256)
gen.x3.shape: (?, 16, 16, 128)
gen.logits.shape: (?, 28, 28, 1)
Epoch 2/2... Discriminator Loss: 1.2410... Generator Loss: 0.5514
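
If you want to compare the two losses side by side after training, one option is to plot them. A minimal sketch, assuming you capture the return value of train (e.g. losses, _ = train(...)); the cell above discards it, so the variable name losses here is purely illustrative:

import numpy as np
from matplotlib import pyplot

# 'losses' is assumed to hold the list of (d_loss, g_loss) pairs returned by train()
losses_arr = np.array(losses)
pyplot.plot(losses_arr[:, 0], label='Discriminator')
pyplot.plot(losses_arr[:, 1], label='Generator')
pyplot.xlabel('print step (every 25 batches)')
pyplot.ylabel('loss')
pyplot.legend()
pyplot.show()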

CelebA

Run your GAN on CelebA. It will take around 20 minutes on the average GPU to run one epoch. You can run the whole epoch or stop when it starts to generate realistic faces.


In [22]:
batch_size = 64
z_dim = 200
learning_rate = 0.0001
beta1 = 0.1


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
epochs = 5

celeba_dataset = helper.Dataset('celeba', glob(os.path.join(data_dir, 'img_align_celeba/*.jpg')))
tf.reset_default_graph()

with tf.Graph().as_default():
    train(epochs, batch_size, z_dim, learning_rate, beta1, celeba_dataset.get_batches,
          celeba_dataset.shape, celeba_dataset.image_mode)


data shape: (202599, 28, 28, 3)
height: 28, width: 28, channels: 3, z_dim: 200
Epoch 1/5... Discriminator Loss: 3.4456... Generator Loss: 0.0742
Epoch 1/5... Discriminator Loss: 3.2763... Generator Loss: 0.2381
Epoch 1/5... Discriminator Loss: 2.0080... Generator Loss: 0.3828
Epoch 1/5... Discriminator Loss: 2.6455... Generator Loss: 0.2707
Epoch 1/5... Discriminator Loss: 1.8225... Generator Loss: 0.4578
Epoch 1/5... Discriminator Loss: 1.8423... Generator Loss: 0.4079
Epoch 1/5... Discriminator Loss: 1.5701... Generator Loss: 0.5357
Epoch 1/5... Discriminator Loss: 1.5974... Generator Loss: 0.5731
Epoch 1/5... Discriminator Loss: 1.6452... Generator Loss: 0.4845
Epoch 1/5... Discriminator Loss: 1.6508... Generator Loss: 0.5099
Epoch 1/5... Discriminator Loss: 1.6599... Generator Loss: 0.5156
Epoch 1/5... Discriminator Loss: 1.6364... Generator Loss: 0.5242
Epoch 1/5... Discriminator Loss: 1.5252... Generator Loss: 0.5887
Epoch 1/5... Discriminator Loss: 1.6377... Generator Loss: 0.4842
Epoch 1/5... Discriminator Loss: 1.4318... Generator Loss: 0.5991
Epoch 1/5... Discriminator Loss: 1.5129... Generator Loss: 0.5922
Epoch 1/5... Discriminator Loss: 1.6807... Generator Loss: 0.5577
Epoch 1/5... Discriminator Loss: 1.5543... Generator Loss: 0.5376
Epoch 1/5... Discriminator Loss: 1.4963... Generator Loss: 0.5811
Epoch 1/5... Discriminator Loss: 1.5391... Generator Loss: 0.5830
Epoch 1/5... Discriminator Loss: 1.6633... Generator Loss: 0.5796
Epoch 1/5... Discriminator Loss: 1.5104... Generator Loss: 0.6418
Epoch 1/5... Discriminator Loss: 1.5725... Generator Loss: 0.5741
Epoch 1/5... Discriminator Loss: 1.4882... Generator Loss: 0.6147
Epoch 1/5... Discriminator Loss: 1.5578... Generator Loss: 0.5682
Epoch 1/5... Discriminator Loss: 1.4738... Generator Loss: 0.6494
Epoch 1/5... Discriminator Loss: 1.5218... Generator Loss: 0.6085
Epoch 1/5... Discriminator Loss: 1.5078... Generator Loss: 0.6327
Epoch 1/5... Discriminator Loss: 1.4178... Generator Loss: 0.6623
Epoch 1/5... Discriminator Loss: 1.5336... Generator Loss: 0.5828
Epoch 1/5... Discriminator Loss: 1.4801... Generator Loss: 0.6635
Epoch 1/5... Discriminator Loss: 1.5070... Generator Loss: 0.5796
Epoch 1/5... Discriminator Loss: 1.4937... Generator Loss: 0.6504
Epoch 1/5... Discriminator Loss: 1.5250... Generator Loss: 0.5471
Epoch 1/5... Discriminator Loss: 1.4853... Generator Loss: 0.6032
Epoch 1/5... Discriminator Loss: 1.4868... Generator Loss: 0.6371
Epoch 1/5... Discriminator Loss: 1.4204... Generator Loss: 0.6577
Epoch 1/5... Discriminator Loss: 1.4973... Generator Loss: 0.6338
Epoch 1/5... Discriminator Loss: 1.4916... Generator Loss: 0.6988
Epoch 1/5... Discriminator Loss: 1.4691... Generator Loss: 0.6450
Epoch 1/5... Discriminator Loss: 1.4777... Generator Loss: 0.6369
Epoch 1/5... Discriminator Loss: 1.4199... Generator Loss: 0.6598
Epoch 1/5... Discriminator Loss: 1.4924... Generator Loss: 0.6537
Epoch 1/5... Discriminator Loss: 1.4775... Generator Loss: 0.6502
Epoch 1/5... Discriminator Loss: 1.4944... Generator Loss: 0.6465
Epoch 1/5... Discriminator Loss: 1.5116... Generator Loss: 0.6058
Epoch 1/5... Discriminator Loss: 1.4859... Generator Loss: 0.6178
Epoch 1/5... Discriminator Loss: 1.5074... Generator Loss: 0.5888
Epoch 1/5... Discriminator Loss: 1.4025... Generator Loss: 0.6169
Epoch 1/5... Discriminator Loss: 1.4678... Generator Loss: 0.6634
Epoch 1/5... Discriminator Loss: 1.4262... Generator Loss: 0.6537
Epoch 1/5... Discriminator Loss: 1.4868... Generator Loss: 0.6401
Epoch 1/5... Discriminator Loss: 1.6380... Generator Loss: 0.5395
Epoch 1/5... Discriminator Loss: 1.5318... Generator Loss: 0.6012
Epoch 1/5... Discriminator Loss: 1.3883... Generator Loss: 0.6596
Epoch 1/5... Discriminator Loss: 1.4776... Generator Loss: 0.6032
Epoch 1/5... Discriminator Loss: 1.4928... Generator Loss: 0.6808
Epoch 1/5... Discriminator Loss: 1.4395... Generator Loss: 0.7124
Epoch 1/5... Discriminator Loss: 1.4981... Generator Loss: 0.6039
Epoch 1/5... Discriminator Loss: 1.4703... Generator Loss: 0.6421
Epoch 1/5... Discriminator Loss: 1.4920... Generator Loss: 0.6318
Epoch 1/5... Discriminator Loss: 1.4784... Generator Loss: 0.6301
Epoch 1/5... Discriminator Loss: 1.4240... Generator Loss: 0.6569
Epoch 1/5... Discriminator Loss: 1.4764... Generator Loss: 0.6595
Epoch 1/5... Discriminator Loss: 1.4538... Generator Loss: 0.6144
Epoch 1/5... Discriminator Loss: 1.4036... Generator Loss: 0.6427
Epoch 1/5... Discriminator Loss: 1.4927... Generator Loss: 0.5934
Epoch 1/5... Discriminator Loss: 1.4007... Generator Loss: 0.6705
Epoch 1/5... Discriminator Loss: 1.4572... Generator Loss: 0.6294
Epoch 1/5... Discriminator Loss: 1.4691... Generator Loss: 0.6037
Epoch 1/5... Discriminator Loss: 1.4668... Generator Loss: 0.5928
Epoch 1/5... Discriminator Loss: 1.4498... Generator Loss: 0.6190
Epoch 1/5... Discriminator Loss: 1.4740... Generator Loss: 0.6522
Epoch 1/5... Discriminator Loss: 1.5020... Generator Loss: 0.6301
Epoch 1/5... Discriminator Loss: 1.5189... Generator Loss: 0.5795
Epoch 1/5... Discriminator Loss: 1.4440... Generator Loss: 0.6397
Epoch 1/5... Discriminator Loss: 1.4820... Generator Loss: 0.6096
Epoch 1/5... Discriminator Loss: 1.4250... Generator Loss: 0.6590
Epoch 1/5... Discriminator Loss: 1.4422... Generator Loss: 0.6448
Epoch 1/5... Discriminator Loss: 1.4007... Generator Loss: 0.6915
Epoch 1/5... Discriminator Loss: 1.4543... Generator Loss: 0.6433
Epoch 1/5... Discriminator Loss: 1.4959... Generator Loss: 0.5938
Epoch 1/5... Discriminator Loss: 1.5064... Generator Loss: 0.6159
Epoch 1/5... Discriminator Loss: 1.5115... Generator Loss: 0.6853
Epoch 1/5... Discriminator Loss: 1.5544... Generator Loss: 0.5778
Epoch 1/5... Discriminator Loss: 1.4672... Generator Loss: 0.5967
Epoch 1/5... Discriminator Loss: 1.4734... Generator Loss: 0.6402
Epoch 1/5... Discriminator Loss: 1.4730... Generator Loss: 0.6366
Epoch 1/5... Discriminator Loss: 1.4338... Generator Loss: 0.6739
Epoch 1/5... Discriminator Loss: 1.4568... Generator Loss: 0.6098
Epoch 1/5... Discriminator Loss: 1.4783... Generator Loss: 0.6262
Epoch 1/5... Discriminator Loss: 1.4459... Generator Loss: 0.6790
Epoch 1/5... Discriminator Loss: 1.4517... Generator Loss: 0.6693
Epoch 1/5... Discriminator Loss: 1.4443... Generator Loss: 0.6472
Epoch 1/5... Discriminator Loss: 1.4217... Generator Loss: 0.7053
Epoch 1/5... Discriminator Loss: 1.4526... Generator Loss: 0.6650
Epoch 1/5... Discriminator Loss: 1.4648... Generator Loss: 0.6035
Epoch 1/5... Discriminator Loss: 1.5069... Generator Loss: 0.6155
Epoch 1/5... Discriminator Loss: 1.4274... Generator Loss: 0.6332
Epoch 1/5... Discriminator Loss: 1.4190... Generator Loss: 0.6283
Epoch 1/5... Discriminator Loss: 1.4616... Generator Loss: 0.6520
Epoch 1/5... Discriminator Loss: 1.4408... Generator Loss: 0.6165
Epoch 1/5... Discriminator Loss: 1.4343... Generator Loss: 0.6734
Epoch 1/5... Discriminator Loss: 1.4706... Generator Loss: 0.6589
Epoch 1/5... Discriminator Loss: 1.4267... Generator Loss: 0.6597
Epoch 1/5... Discriminator Loss: 1.4494... Generator Loss: 0.6601
Epoch 1/5... Discriminator Loss: 1.4930... Generator Loss: 0.6279
Epoch 1/5... Discriminator Loss: 1.4984... Generator Loss: 0.5819
Epoch 1/5... Discriminator Loss: 1.4857... Generator Loss: 0.6473
Epoch 1/5... Discriminator Loss: 1.4279... Generator Loss: 0.6416
Epoch 1/5... Discriminator Loss: 1.4369... Generator Loss: 0.6403
Epoch 1/5... Discriminator Loss: 1.4550... Generator Loss: 0.6318
Epoch 1/5... Discriminator Loss: 1.4536... Generator Loss: 0.6470
Epoch 1/5... Discriminator Loss: 1.4268... Generator Loss: 0.6342
Epoch 1/5... Discriminator Loss: 1.5182... Generator Loss: 0.6035
Epoch 1/5... Discriminator Loss: 1.4308... Generator Loss: 0.6520
Epoch 1/5... Discriminator Loss: 1.4340... Generator Loss: 0.6798
Epoch 1/5... Discriminator Loss: 1.4576... Generator Loss: 0.6043
Epoch 1/5... Discriminator Loss: 1.4041... Generator Loss: 0.7042
Epoch 1/5... Discriminator Loss: 1.4555... Generator Loss: 0.6553
Epoch 1/5... Discriminator Loss: 1.4252... Generator Loss: 0.6420
Epoch 1/5... Discriminator Loss: 1.4521... Generator Loss: 0.6569
Epoch 1/5... Discriminator Loss: 1.4412... Generator Loss: 0.6612
Epoch 1/5... Discriminator Loss: 1.4374... Generator Loss: 0.6205
Epoch 1/5... Discriminator Loss: 1.4383... Generator Loss: 0.6427
Epoch 1/5... Discriminator Loss: 1.4547... Generator Loss: 0.6257
Epoch 2/5... Discriminator Loss: 1.4648... Generator Loss: 0.6530
Epoch 2/5... Discriminator Loss: 1.4509... Generator Loss: 0.6396
Epoch 2/5... Discriminator Loss: 1.4396... Generator Loss: 0.6583
Epoch 2/5... Discriminator Loss: 1.4433... Generator Loss: 0.6580
Epoch 2/5... Discriminator Loss: 1.4597... Generator Loss: 0.6777
Epoch 2/5... Discriminator Loss: 1.4117... Generator Loss: 0.6654
Epoch 2/5... Discriminator Loss: 1.4964... Generator Loss: 0.6418
Epoch 2/5... Discriminator Loss: 1.4272... Generator Loss: 0.6602
Epoch 2/5... Discriminator Loss: 1.4315... Generator Loss: 0.6714
Epoch 2/5... Discriminator Loss: 1.3977... Generator Loss: 0.6881
Epoch 2/5... Discriminator Loss: 1.4434... Generator Loss: 0.6648
Epoch 2/5... Discriminator Loss: 1.4483... Generator Loss: 0.6586
Epoch 2/5... Discriminator Loss: 1.4593... Generator Loss: 0.6377
Epoch 2/5... Discriminator Loss: 1.4348... Generator Loss: 0.6549
Epoch 2/5... Discriminator Loss: 1.4829... Generator Loss: 0.6320
Epoch 2/5... Discriminator Loss: 1.4734... Generator Loss: 0.6387
Epoch 2/5... Discriminator Loss: 1.4908... Generator Loss: 0.6210
Epoch 2/5... Discriminator Loss: 1.4688... Generator Loss: 0.6470
Epoch 2/5... Discriminator Loss: 1.4558... Generator Loss: 0.6433
Epoch 2/5... Discriminator Loss: 1.4464... Generator Loss: 0.6508
Epoch 2/5... Discriminator Loss: 1.4057... Generator Loss: 0.6569
Epoch 2/5... Discriminator Loss: 1.4438... Generator Loss: 0.6267
Epoch 2/5... Discriminator Loss: 1.4157... Generator Loss: 0.6730
Epoch 2/5... Discriminator Loss: 1.4133... Generator Loss: 0.6744
Epoch 2/5... Discriminator Loss: 1.4601... Generator Loss: 0.6327
Epoch 2/5... Discriminator Loss: 1.4308... Generator Loss: 0.6782
Epoch 2/5... Discriminator Loss: 1.4348... Generator Loss: 0.6591
Epoch 2/5... Discriminator Loss: 1.4325... Generator Loss: 0.6486
Epoch 2/5... Discriminator Loss: 1.4618... Generator Loss: 0.6379
Epoch 2/5... Discriminator Loss: 1.4256... Generator Loss: 0.6964
Epoch 2/5... Discriminator Loss: 1.4261... Generator Loss: 0.6349
Epoch 2/5... Discriminator Loss: 1.4329... Generator Loss: 0.6602
Epoch 2/5... Discriminator Loss: 1.4104... Generator Loss: 0.6613
Epoch 2/5... Discriminator Loss: 1.4633... Generator Loss: 0.6332
Epoch 2/5... Discriminator Loss: 1.4590... Generator Loss: 0.5934
Epoch 2/5... Discriminator Loss: 1.4059... Generator Loss: 0.6693
Epoch 2/5... Discriminator Loss: 1.4433... Generator Loss: 0.6500
Epoch 2/5... Discriminator Loss: 1.3965... Generator Loss: 0.6925
Epoch 2/5... Discriminator Loss: 1.4178... Generator Loss: 0.6497
Epoch 2/5... Discriminator Loss: 1.4203... Generator Loss: 0.6787
Epoch 2/5... Discriminator Loss: 1.4104... Generator Loss: 0.6545
Epoch 2/5... Discriminator Loss: 1.4096... Generator Loss: 0.6777
Epoch 2/5... Discriminator Loss: 1.4070... Generator Loss: 0.6460
Epoch 2/5... Discriminator Loss: 1.4659... Generator Loss: 0.6267
Epoch 2/5... Discriminator Loss: 1.4466... Generator Loss: 0.6643
Epoch 2/5... Discriminator Loss: 1.4065... Generator Loss: 0.6681
Epoch 2/5... Discriminator Loss: 1.4488... Generator Loss: 0.6724
Epoch 2/5... Discriminator Loss: 1.4261... Generator Loss: 0.6326
Epoch 2/5... Discriminator Loss: 1.4515... Generator Loss: 0.6178
Epoch 2/5... Discriminator Loss: 1.4258... Generator Loss: 0.6705
Epoch 2/5... Discriminator Loss: 1.4067... Generator Loss: 0.6454
Epoch 2/5... Discriminator Loss: 1.4372... Generator Loss: 0.6817
Epoch 2/5... Discriminator Loss: 1.4595... Generator Loss: 0.6668
Epoch 2/5... Discriminator Loss: 1.4447... Generator Loss: 0.6473
Epoch 2/5... Discriminator Loss: 1.4227... Generator Loss: 0.6469
Epoch 2/5... Discriminator Loss: 1.4419... Generator Loss: 0.6455
Epoch 2/5... Discriminator Loss: 1.4599... Generator Loss: 0.6481
Epoch 2/5... Discriminator Loss: 1.3883... Generator Loss: 0.6746
Epoch 2/5... Discriminator Loss: 1.4152... Generator Loss: 0.6894
Epoch 2/5... Discriminator Loss: 1.4032... Generator Loss: 0.6434
Epoch 2/5... Discriminator Loss: 1.3832... Generator Loss: 0.6889
Epoch 2/5... Discriminator Loss: 1.4174... Generator Loss: 0.6180
Epoch 2/5... Discriminator Loss: 1.4217... Generator Loss: 0.6559
Epoch 2/5... Discriminator Loss: 1.4307... Generator Loss: 0.6898
Epoch 2/5... Discriminator Loss: 1.4330... Generator Loss: 0.6660
Epoch 2/5... Discriminator Loss: 1.4096... Generator Loss: 0.6777
Epoch 2/5... Discriminator Loss: 1.4299... Generator Loss: 0.6715
Epoch 2/5... Discriminator Loss: 1.4114... Generator Loss: 0.6673
Epoch 2/5... Discriminator Loss: 1.4383... Generator Loss: 0.6845
Epoch 2/5... Discriminator Loss: 1.4366... Generator Loss: 0.6721
Epoch 2/5... Discriminator Loss: 1.3953... Generator Loss: 0.6741
Epoch 2/5... Discriminator Loss: 1.4331... Generator Loss: 0.6843
Epoch 2/5... Discriminator Loss: 1.4169... Generator Loss: 0.6777
Epoch 2/5... Discriminator Loss: 1.4533... Generator Loss: 0.6748
Epoch 2/5... Discriminator Loss: 1.4276... Generator Loss: 0.6422
Epoch 2/5... Discriminator Loss: 1.4020... Generator Loss: 0.6551
Epoch 2/5... Discriminator Loss: 1.4183... Generator Loss: 0.6939
Epoch 2/5... Discriminator Loss: 1.3988... Generator Loss: 0.6988
Epoch 2/5... Discriminator Loss: 1.4261... Generator Loss: 0.6637
Epoch 2/5... Discriminator Loss: 1.4088... Generator Loss: 0.6558
Epoch 2/5... Discriminator Loss: 1.4366... Generator Loss: 0.6366
Epoch 2/5... Discriminator Loss: 1.4448... Generator Loss: 0.6576
Epoch 2/5... Discriminator Loss: 1.4000... Generator Loss: 0.6998
Epoch 2/5... Discriminator Loss: 1.4248... Generator Loss: 0.6295
Epoch 2/5... Discriminator Loss: 1.5012... Generator Loss: 0.5988
Epoch 2/5... Discriminator Loss: 1.4191... Generator Loss: 0.6766
Epoch 2/5... Discriminator Loss: 1.4608... Generator Loss: 0.6493
Epoch 2/5... Discriminator Loss: 1.3884... Generator Loss: 0.6862
Epoch 2/5... Discriminator Loss: 1.4789... Generator Loss: 0.6323
Epoch 2/5... Discriminator Loss: 1.4394... Generator Loss: 0.6655
Epoch 2/5... Discriminator Loss: 1.3951... Generator Loss: 0.6615
Epoch 2/5... Discriminator Loss: 1.4032... Generator Loss: 0.7007
Epoch 2/5... Discriminator Loss: 1.4335... Generator Loss: 0.6648
Epoch 2/5... Discriminator Loss: 1.4191... Generator Loss: 0.7062
Epoch 2/5... Discriminator Loss: 1.4322... Generator Loss: 0.6278
Epoch 2/5... Discriminator Loss: 1.4086... Generator Loss: 0.6806
Epoch 2/5... Discriminator Loss: 1.4234... Generator Loss: 0.7022
Epoch 2/5... Discriminator Loss: 1.4251... Generator Loss: 0.6378
Epoch 2/5... Discriminator Loss: 1.4102... Generator Loss: 0.6762
Epoch 2/5... Discriminator Loss: 1.4152... Generator Loss: 0.6444
Epoch 2/5... Discriminator Loss: 1.3918... Generator Loss: 0.6842
Epoch 2/5... Discriminator Loss: 1.4564... Generator Loss: 0.6480
Epoch 2/5... Discriminator Loss: 1.4236... Generator Loss: 0.6482
Epoch 2/5... Discriminator Loss: 1.4438... Generator Loss: 0.6550
Epoch 2/5... Discriminator Loss: 1.4349... Generator Loss: 0.6333
Epoch 2/5... Discriminator Loss: 1.4126... Generator Loss: 0.6601
Epoch 2/5... Discriminator Loss: 1.4428... Generator Loss: 0.6488
Epoch 2/5... Discriminator Loss: 1.4200... Generator Loss: 0.6385
Epoch 2/5... Discriminator Loss: 1.4340... Generator Loss: 0.6622
Epoch 2/5... Discriminator Loss: 1.4241... Generator Loss: 0.6583
Epoch 2/5... Discriminator Loss: 1.4435... Generator Loss: 0.6535
Epoch 2/5... Discriminator Loss: 1.4242... Generator Loss: 0.6792
Epoch 2/5... Discriminator Loss: 1.4408... Generator Loss: 0.6668
Epoch 2/5... Discriminator Loss: 1.4440... Generator Loss: 0.6440
Epoch 2/5... Discriminator Loss: 1.4091... Generator Loss: 0.6665
Epoch 2/5... Discriminator Loss: 1.4099... Generator Loss: 0.7007
Epoch 2/5... Discriminator Loss: 1.4141... Generator Loss: 0.6693
Epoch 2/5... Discriminator Loss: 1.4117... Generator Loss: 0.6674
Epoch 2/5... Discriminator Loss: 1.3967... Generator Loss: 0.6395
Epoch 2/5... Discriminator Loss: 1.4217... Generator Loss: 0.6642
Epoch 2/5... Discriminator Loss: 1.4243... Generator Loss: 0.6662
Epoch 2/5... Discriminator Loss: 1.4030... Generator Loss: 0.6918
Epoch 2/5... Discriminator Loss: 1.4073... Generator Loss: 0.6595
Epoch 2/5... Discriminator Loss: 1.3966... Generator Loss: 0.6933
Epoch 2/5... Discriminator Loss: 1.4239... Generator Loss: 0.6868
Epoch 2/5... Discriminator Loss: 1.4225... Generator Loss: 0.6782
Epoch 2/5... Discriminator Loss: 1.3993... Generator Loss: 0.6902
Epoch 3/5... Discriminator Loss: 1.4521... Generator Loss: 0.6320
Epoch 3/5... Discriminator Loss: 1.4006... Generator Loss: 0.6917
Epoch 3/5... Discriminator Loss: 1.4193... Generator Loss: 0.6589
Epoch 3/5... Discriminator Loss: 1.4212... Generator Loss: 0.6926
Epoch 3/5... Discriminator Loss: 1.4171... Generator Loss: 0.6597
Epoch 3/5... Discriminator Loss: 1.4501... Generator Loss: 0.6460
Epoch 3/5... Discriminator Loss: 1.4592... Generator Loss: 0.6517
Epoch 3/5... Discriminator Loss: 1.4041... Generator Loss: 0.6790
Epoch 3/5... Discriminator Loss: 1.4017... Generator Loss: 0.6606
Epoch 3/5... Discriminator Loss: 1.4285... Generator Loss: 0.6516
Epoch 3/5... Discriminator Loss: 1.4277... Generator Loss: 0.6586
Epoch 3/5... Discriminator Loss: 1.4362... Generator Loss: 0.6352
Epoch 3/5... Discriminator Loss: 1.4144... Generator Loss: 0.6739
Epoch 3/5... Discriminator Loss: 1.4159... Generator Loss: 0.6948
Epoch 3/5... Discriminator Loss: 1.4382... Generator Loss: 0.6461
Epoch 3/5... Discriminator Loss: 1.4305... Generator Loss: 0.6530
Epoch 3/5... Discriminator Loss: 1.4254... Generator Loss: 0.6512
Epoch 3/5... Discriminator Loss: 1.4083... Generator Loss: 0.6792
Epoch 3/5... Discriminator Loss: 1.4095... Generator Loss: 0.6697
Epoch 3/5... Discriminator Loss: 1.4358... Generator Loss: 0.6355
Epoch 3/5... Discriminator Loss: 1.4008... Generator Loss: 0.6673
Epoch 3/5... Discriminator Loss: 1.4113... Generator Loss: 0.6618
Epoch 3/5... Discriminator Loss: 1.3871... Generator Loss: 0.6715
Epoch 3/5... Discriminator Loss: 1.4457... Generator Loss: 0.6503
Epoch 3/5... Discriminator Loss: 1.4452... Generator Loss: 0.6686
Epoch 3/5... Discriminator Loss: 1.4323... Generator Loss: 0.6546
Epoch 3/5... Discriminator Loss: 1.3865... Generator Loss: 0.6785
Epoch 3/5... Discriminator Loss: 1.4170... Generator Loss: 0.6862
Epoch 3/5... Discriminator Loss: 1.4303... Generator Loss: 0.6838
Epoch 3/5... Discriminator Loss: 1.4085... Generator Loss: 0.6813
Epoch 3/5... Discriminator Loss: 1.4517... Generator Loss: 0.6439
Epoch 3/5... Discriminator Loss: 1.4151... Generator Loss: 0.6640
Epoch 3/5... Discriminator Loss: 1.4104... Generator Loss: 0.6553
Epoch 3/5... Discriminator Loss: 1.4053... Generator Loss: 0.6482
Epoch 3/5... Discriminator Loss: 1.3843... Generator Loss: 0.6772
Epoch 3/5... Discriminator Loss: 1.3842... Generator Loss: 0.7107
Epoch 3/5... Discriminator Loss: 1.3948... Generator Loss: 0.6595
Epoch 3/5... Discriminator Loss: 1.4273... Generator Loss: 0.6750
Epoch 3/5... Discriminator Loss: 1.4125... Generator Loss: 0.6820
Epoch 3/5... Discriminator Loss: 1.4003... Generator Loss: 0.6555
Epoch 3/5... Discriminator Loss: 1.4276... Generator Loss: 0.6566
Epoch 3/5... Discriminator Loss: 1.4614... Generator Loss: 0.6455
Epoch 3/5... Discriminator Loss: 1.4233... Generator Loss: 0.6462
Epoch 3/5... Discriminator Loss: 1.4053... Generator Loss: 0.6420
Epoch 3/5... Discriminator Loss: 1.3962... Generator Loss: 0.6641
Epoch 3/5... Discriminator Loss: 1.4295... Generator Loss: 0.6895
Epoch 3/5... Discriminator Loss: 1.3936... Generator Loss: 0.6863
Epoch 3/5... Discriminator Loss: 1.4178... Generator Loss: 0.7106
Epoch 3/5... Discriminator Loss: 1.4504... Generator Loss: 0.6555
Epoch 3/5... Discriminator Loss: 1.4333... Generator Loss: 0.6770
Epoch 3/5... Discriminator Loss: 1.3984... Generator Loss: 0.6540
Epoch 3/5... Discriminator Loss: 1.4068... Generator Loss: 0.6772
Epoch 3/5... Discriminator Loss: 1.4185... Generator Loss: 0.6575
Epoch 3/5... Discriminator Loss: 1.4234... Generator Loss: 0.6521
Epoch 3/5... Discriminator Loss: 1.4148... Generator Loss: 0.6232
Epoch 3/5... Discriminator Loss: 1.4033... Generator Loss: 0.6684
Epoch 3/5... Discriminator Loss: 1.4135... Generator Loss: 0.6600
Epoch 3/5... Discriminator Loss: 1.4001... Generator Loss: 0.6964
Epoch 3/5... Discriminator Loss: 1.3912... Generator Loss: 0.6689
Epoch 3/5... Discriminator Loss: 1.4227... Generator Loss: 0.6652
Epoch 3/5... Discriminator Loss: 1.4228... Generator Loss: 0.6511
Epoch 3/5... Discriminator Loss: 1.4221... Generator Loss: 0.6601
Epoch 3/5... Discriminator Loss: 1.4007... Generator Loss: 0.6680
Epoch 3/5... Discriminator Loss: 1.4032... Generator Loss: 0.6663
Epoch 3/5... Discriminator Loss: 1.3972... Generator Loss: 0.6904
Epoch 3/5... Discriminator Loss: 1.4046... Generator Loss: 0.6983
Epoch 3/5... Discriminator Loss: 1.4113... Generator Loss: 0.6439
Epoch 3/5... Discriminator Loss: 1.4324... Generator Loss: 0.6372
Epoch 3/5... Discriminator Loss: 1.4185... Generator Loss: 0.6703
Epoch 3/5... Discriminator Loss: 1.4020... Generator Loss: 0.7015
Epoch 3/5... Discriminator Loss: 1.4150... Generator Loss: 0.6541
Epoch 3/5... Discriminator Loss: 1.3923... Generator Loss: 0.6981
Epoch 3/5... Discriminator Loss: 1.3958... Generator Loss: 0.6906
Epoch 3/5... Discriminator Loss: 1.4193... Generator Loss: 0.6667
Epoch 3/5... Discriminator Loss: 1.3973... Generator Loss: 0.6717
Epoch 3/5... Discriminator Loss: 1.4088... Generator Loss: 0.6653
Epoch 3/5... Discriminator Loss: 1.4194... Generator Loss: 0.6754
Epoch 3/5... Discriminator Loss: 1.4010... Generator Loss: 0.6666
Epoch 3/5... Discriminator Loss: 1.4068... Generator Loss: 0.6935
Epoch 3/5... Discriminator Loss: 1.4002... Generator Loss: 0.6651
Epoch 3/5... Discriminator Loss: 1.4629... Generator Loss: 0.6401
Epoch 3/5... Discriminator Loss: 1.3987... Generator Loss: 0.6860
Epoch 3/5... Discriminator Loss: 1.3962... Generator Loss: 0.6625
Epoch 3/5... Discriminator Loss: 1.4105... Generator Loss: 0.6756
Epoch 3/5... Discriminator Loss: 1.4287... Generator Loss: 0.6460
Epoch 3/5... Discriminator Loss: 1.4259... Generator Loss: 0.6662
Epoch 3/5... Discriminator Loss: 1.4196... Generator Loss: 0.6854
Epoch 3/5... Discriminator Loss: 1.3995... Generator Loss: 0.6763
Epoch 3/5... Discriminator Loss: 1.4018... Generator Loss: 0.6939
Epoch 3/5... Discriminator Loss: 1.4261... Generator Loss: 0.6795
Epoch 3/5... Discriminator Loss: 1.3902... Generator Loss: 0.7251
Epoch 3/5... Discriminator Loss: 1.4114... Generator Loss: 0.6586
Epoch 3/5... Discriminator Loss: 1.4284... Generator Loss: 0.6787
Epoch 3/5... Discriminator Loss: 1.4035... Generator Loss: 0.6892
Epoch 3/5... Discriminator Loss: 1.4117... Generator Loss: 0.6909
Epoch 3/5... Discriminator Loss: 1.4180... Generator Loss: 0.6637
Epoch 3/5... Discriminator Loss: 1.4007... Generator Loss: 0.6387
Epoch 3/5... Discriminator Loss: 1.4393... Generator Loss: 0.6465
Epoch 3/5... Discriminator Loss: 1.4194... Generator Loss: 0.6373
Epoch 3/5... Discriminator Loss: 1.4290... Generator Loss: 0.6601
Epoch 3/5... Discriminator Loss: 1.4028... Generator Loss: 0.6650
Epoch 3/5... Discriminator Loss: 1.4331... Generator Loss: 0.6627
Epoch 3/5... Discriminator Loss: 1.4432... Generator Loss: 0.6389
Epoch 3/5... Discriminator Loss: 1.4164... Generator Loss: 0.6608
Epoch 3/5... Discriminator Loss: 1.4128... Generator Loss: 0.6837
Epoch 3/5... Discriminator Loss: 1.4162... Generator Loss: 0.6703
Epoch 3/5... Discriminator Loss: 1.3950... Generator Loss: 0.7059
Epoch 3/5... Discriminator Loss: 1.4033... Generator Loss: 0.6649
Epoch 3/5... Discriminator Loss: 1.4358... Generator Loss: 0.6565
Epoch 3/5... Discriminator Loss: 1.4136... Generator Loss: 0.6756
Epoch 3/5... Discriminator Loss: 1.4147... Generator Loss: 0.6797
Epoch 3/5... Discriminator Loss: 1.3978... Generator Loss: 0.7122
Epoch 3/5... Discriminator Loss: 1.3972... Generator Loss: 0.6826
Epoch 3/5... Discriminator Loss: 1.4139... Generator Loss: 0.6604
Epoch 3/5... Discriminator Loss: 1.4016... Generator Loss: 0.6708
Epoch 3/5... Discriminator Loss: 1.3869... Generator Loss: 0.7042
Epoch 3/5... Discriminator Loss: 1.4054... Generator Loss: 0.6597
Epoch 3/5... Discriminator Loss: 1.4208... Generator Loss: 0.6584
Epoch 3/5... Discriminator Loss: 1.4063... Generator Loss: 0.6575
Epoch 3/5... Discriminator Loss: 1.4358... Generator Loss: 0.6499
Epoch 3/5... Discriminator Loss: 1.3974... Generator Loss: 0.6468
Epoch 3/5... Discriminator Loss: 1.3926... Generator Loss: 0.6976
Epoch 3/5... Discriminator Loss: 1.3962... Generator Loss: 0.6973
Epoch 3/5... Discriminator Loss: 1.4129... Generator Loss: 0.6820
Epoch 3/5... Discriminator Loss: 1.4188... Generator Loss: 0.6626
Epoch 3/5... Discriminator Loss: 1.4263... Generator Loss: 0.6812
Epoch 4/5... Discriminator Loss: 1.4170... Generator Loss: 0.6350
Epoch 4/5... Discriminator Loss: 1.4503... Generator Loss: 0.6692
Epoch 4/5... Discriminator Loss: 1.4174... Generator Loss: 0.6631
Epoch 4/5... Discriminator Loss: 1.3953... Generator Loss: 0.6828
Epoch 4/5... Discriminator Loss: 1.4114... Generator Loss: 0.6497
Epoch 4/5... Discriminator Loss: 1.4221... Generator Loss: 0.6824
Epoch 4/5... Discriminator Loss: 1.4311... Generator Loss: 0.6460
Epoch 4/5... Discriminator Loss: 1.4189... Generator Loss: 0.6769
Epoch 4/5... Discriminator Loss: 1.4360... Generator Loss: 0.6336
Epoch 4/5... Discriminator Loss: 1.4216... Generator Loss: 0.6719
Epoch 4/5... Discriminator Loss: 1.4305... Generator Loss: 0.6556
Epoch 4/5... Discriminator Loss: 1.4049... Generator Loss: 0.6438
Epoch 4/5... Discriminator Loss: 1.3872... Generator Loss: 0.6950
Epoch 4/5... Discriminator Loss: 1.3999... Generator Loss: 0.6767
Epoch 4/5... Discriminator Loss: 1.3964... Generator Loss: 0.6525
Epoch 4/5... Discriminator Loss: 1.4181... Generator Loss: 0.6682
Epoch 4/5... Discriminator Loss: 1.4184... Generator Loss: 0.6890
Epoch 4/5... Discriminator Loss: 1.4041... Generator Loss: 0.6582
Epoch 4/5... Discriminator Loss: 1.4247... Generator Loss: 0.6795
Epoch 4/5... Discriminator Loss: 1.3975... Generator Loss: 0.6958
Epoch 4/5... Discriminator Loss: 1.4405... Generator Loss: 0.6844
Epoch 4/5... Discriminator Loss: 1.3992... Generator Loss: 0.6658
Epoch 4/5... Discriminator Loss: 1.4093... Generator Loss: 0.6650
Epoch 4/5... Discriminator Loss: 1.4079... Generator Loss: 0.6929
Epoch 4/5... Discriminator Loss: 1.3940... Generator Loss: 0.6780
Epoch 4/5... Discriminator Loss: 1.4052... Generator Loss: 0.6857
Epoch 4/5... Discriminator Loss: 1.4011... Generator Loss: 0.6755
Epoch 4/5... Discriminator Loss: 1.4241... Generator Loss: 0.6528
Epoch 4/5... Discriminator Loss: 1.4289... Generator Loss: 0.6460
Epoch 4/5... Discriminator Loss: 1.4179... Generator Loss: 0.6645
Epoch 4/5... Discriminator Loss: 1.4164... Generator Loss: 0.6814
Epoch 4/5... Discriminator Loss: 1.3983... Generator Loss: 0.6922
Epoch 4/5... Discriminator Loss: 1.4093... Generator Loss: 0.6691
Epoch 4/5... Discriminator Loss: 1.4084... Generator Loss: 0.6552
Epoch 4/5... Discriminator Loss: 1.4221... Generator Loss: 0.6592
Epoch 4/5... Discriminator Loss: 1.4017... Generator Loss: 0.6957
Epoch 4/5... Discriminator Loss: 1.4178... Generator Loss: 0.6580
Epoch 4/5... Discriminator Loss: 1.3962... Generator Loss: 0.6625
Epoch 4/5... Discriminator Loss: 1.4141... Generator Loss: 0.6897
Epoch 4/5... Discriminator Loss: 1.4195... Generator Loss: 0.6583
Epoch 4/5... Discriminator Loss: 1.4054... Generator Loss: 0.6941
Epoch 4/5... Discriminator Loss: 1.4024... Generator Loss: 0.6808
Epoch 4/5... Discriminator Loss: 1.4185... Generator Loss: 0.6856
Epoch 4/5... Discriminator Loss: 1.4004... Generator Loss: 0.7198
Epoch 4/5... Discriminator Loss: 1.4181... Generator Loss: 0.6363
Epoch 4/5... Discriminator Loss: 1.4089... Generator Loss: 0.6979
Epoch 4/5... Discriminator Loss: 1.4008... Generator Loss: 0.6810
Epoch 4/5... Discriminator Loss: 1.4123... Generator Loss: 0.6416
Epoch 4/5... Discriminator Loss: 1.4302... Generator Loss: 0.6385
Epoch 4/5... Discriminator Loss: 1.3996... Generator Loss: 0.6632
Epoch 4/5... Discriminator Loss: 1.4163... Generator Loss: 0.6626
Epoch 4/5... Discriminator Loss: 1.3954... Generator Loss: 0.6663
Epoch 4/5... Discriminator Loss: 1.4163... Generator Loss: 0.6431
Epoch 4/5... Discriminator Loss: 1.3914... Generator Loss: 0.7067
Epoch 4/5... Discriminator Loss: 1.4241... Generator Loss: 0.6576
Epoch 4/5... Discriminator Loss: 1.3880... Generator Loss: 0.6774
Epoch 4/5... Discriminator Loss: 1.3893... Generator Loss: 0.6848
Epoch 4/5... Discriminator Loss: 1.4073... Generator Loss: 0.6554
Epoch 4/5... Discriminator Loss: 1.4071... Generator Loss: 0.6948
Epoch 4/5... Discriminator Loss: 1.4344... Generator Loss: 0.6411
Epoch 4/5... Discriminator Loss: 1.3992... Generator Loss: 0.7087
Epoch 4/5... Discriminator Loss: 1.4095... Generator Loss: 0.6734
Epoch 4/5... Discriminator Loss: 1.4140... Generator Loss: 0.6654
Epoch 4/5... Discriminator Loss: 1.3969... Generator Loss: 0.7013
Epoch 4/5... Discriminator Loss: 1.4123... Generator Loss: 0.6922
Epoch 4/5... Discriminator Loss: 1.4220... Generator Loss: 0.6637
Epoch 4/5... Discriminator Loss: 1.4032... Generator Loss: 0.7229
Epoch 4/5... Discriminator Loss: 1.4260... Generator Loss: 0.6684
Epoch 4/5... Discriminator Loss: 1.4297... Generator Loss: 0.6585
Epoch 4/5... Discriminator Loss: 1.4041... Generator Loss: 0.6561
Epoch 4/5... Discriminator Loss: 1.4260... Generator Loss: 0.6636
Epoch 4/5... Discriminator Loss: 1.4102... Generator Loss: 0.7111
Epoch 4/5... Discriminator Loss: 1.4018... Generator Loss: 0.6455
Epoch 4/5... Discriminator Loss: 1.4071... Generator Loss: 0.6989
Epoch 4/5... Discriminator Loss: 1.4154... Generator Loss: 0.6475
Epoch 4/5... Discriminator Loss: 1.3939... Generator Loss: 0.7006
Epoch 4/5... Discriminator Loss: 1.4113... Generator Loss: 0.6721
Epoch 4/5... Discriminator Loss: 1.3963... Generator Loss: 0.6915
Epoch 4/5... Discriminator Loss: 1.4047... Generator Loss: 0.7029
Epoch 4/5... Discriminator Loss: 1.4098... Generator Loss: 0.6505
Epoch 4/5... Discriminator Loss: 1.3922... Generator Loss: 0.6826
Epoch 4/5... Discriminator Loss: 1.4148... Generator Loss: 0.6477
Epoch 4/5... Discriminator Loss: 1.3812... Generator Loss: 0.6863
Epoch 4/5... Discriminator Loss: 1.4081... Generator Loss: 0.6622
Epoch 4/5... Discriminator Loss: 1.4003... Generator Loss: 0.6818
Epoch 4/5... Discriminator Loss: 1.4237... Generator Loss: 0.6708
Epoch 4/5... Discriminator Loss: 1.3939... Generator Loss: 0.6652
Epoch 4/5... Discriminator Loss: 1.4234... Generator Loss: 0.6703
Epoch 4/5... Discriminator Loss: 1.4297... Generator Loss: 0.6394
Epoch 4/5... Discriminator Loss: 1.4087... Generator Loss: 0.6755
Epoch 4/5... Discriminator Loss: 1.4015... Generator Loss: 0.6515
Epoch 4/5... Discriminator Loss: 1.4219... Generator Loss: 0.6917
Epoch 4/5... Discriminator Loss: 1.4266... Generator Loss: 0.6553
Epoch 4/5... Discriminator Loss: 1.4347... Generator Loss: 0.6580
Epoch 4/5... Discriminator Loss: 1.4081... Generator Loss: 0.7035
Epoch 4/5... Discriminator Loss: 1.4170... Generator Loss: 0.6682
Epoch 4/5... Discriminator Loss: 1.4033... Generator Loss: 0.6637
Epoch 4/5... Discriminator Loss: 1.4138... Generator Loss: 0.6847
Epoch 4/5... Discriminator Loss: 1.4072... Generator Loss: 0.6259
Epoch 4/5... Discriminator Loss: 1.4116... Generator Loss: 0.6584
Epoch 4/5... Discriminator Loss: 1.3876... Generator Loss: 0.6563
Epoch 4/5... Discriminator Loss: 1.4091... Generator Loss: 0.6824
Epoch 4/5... Discriminator Loss: 1.3960... Generator Loss: 0.7184
Epoch 4/5... Discriminator Loss: 1.4157... Generator Loss: 0.6569
Epoch 4/5... Discriminator Loss: 1.4021... Generator Loss: 0.6692
Epoch 4/5... Discriminator Loss: 1.4022... Generator Loss: 0.7045
Epoch 4/5... Discriminator Loss: 1.4083... Generator Loss: 0.6505
Epoch 4/5... Discriminator Loss: 1.4051... Generator Loss: 0.6936
Epoch 4/5... Discriminator Loss: 1.3925... Generator Loss: 0.6578
Epoch 4/5... Discriminator Loss: 1.4192... Generator Loss: 0.6653
Epoch 4/5... Discriminator Loss: 1.4176... Generator Loss: 0.6687
Epoch 4/5... Discriminator Loss: 1.4061... Generator Loss: 0.7056
Epoch 4/5... Discriminator Loss: 1.4262... Generator Loss: 0.6587
Epoch 4/5... Discriminator Loss: 1.4240... Generator Loss: 0.6770
Epoch 4/5... Discriminator Loss: 1.4006... Generator Loss: 0.6572
Epoch 4/5... Discriminator Loss: 1.4228... Generator Loss: 0.6782
Epoch 4/5... Discriminator Loss: 1.3887... Generator Loss: 0.6755
Epoch 4/5... Discriminator Loss: 1.4019... Generator Loss: 0.6755
Epoch 4/5... Discriminator Loss: 1.4268... Generator Loss: 0.6644
Epoch 4/5... Discriminator Loss: 1.3944... Generator Loss: 0.6722
Epoch 4/5... Discriminator Loss: 1.4330... Generator Loss: 0.6828
Epoch 4/5... Discriminator Loss: 1.3980... Generator Loss: 0.6554
Epoch 4/5... Discriminator Loss: 1.3966... Generator Loss: 0.6952
Epoch 4/5... Discriminator Loss: 1.4033... Generator Loss: 0.6795
Epoch 4/5... Discriminator Loss: 1.4150... Generator Loss: 0.6550
Epoch 4/5... Discriminator Loss: 1.3956... Generator Loss: 0.6552
Epoch 4/5... Discriminator Loss: 1.4331... Generator Loss: 0.6451
Epoch 5/5... Discriminator Loss: 1.3988... Generator Loss: 0.6767
Epoch 5/5... Discriminator Loss: 1.4060... Generator Loss: 0.6648
Epoch 5/5... Discriminator Loss: 1.4098... Generator Loss: 0.6616
Epoch 5/5... Discriminator Loss: 1.4120... Generator Loss: 0.6886
Epoch 5/5... Discriminator Loss: 1.3871... Generator Loss: 0.6760
Epoch 5/5... Discriminator Loss: 1.3966... Generator Loss: 0.6573
Epoch 5/5... Discriminator Loss: 1.4224... Generator Loss: 0.6822
Epoch 5/5... Discriminator Loss: 1.3944... Generator Loss: 0.6840
Epoch 5/5... Discriminator Loss: 1.3772... Generator Loss: 0.6697
Epoch 5/5... Discriminator Loss: 1.4064... Generator Loss: 0.6665
Epoch 5/5... Discriminator Loss: 1.4124... Generator Loss: 0.6695
Epoch 5/5... Discriminator Loss: 1.4072... Generator Loss: 0.6828
Epoch 5/5... Discriminator Loss: 1.3963... Generator Loss: 0.6853
Epoch 5/5... Discriminator Loss: 1.4052... Generator Loss: 0.6645
Epoch 5/5... Discriminator Loss: 1.4087... Generator Loss: 0.6905
Epoch 5/5... Discriminator Loss: 1.4005... Generator Loss: 0.6851
Epoch 5/5... Discriminator Loss: 1.3931... Generator Loss: 0.6918
Epoch 5/5... Discriminator Loss: 1.4163... Generator Loss: 0.6432
Epoch 5/5... Discriminator Loss: 1.4052... Generator Loss: 0.6460
Epoch 5/5... Discriminator Loss: 1.4056... Generator Loss: 0.6940
Epoch 5/5... Discriminator Loss: 1.4138... Generator Loss: 0.6798
Epoch 5/5... Discriminator Loss: 1.4134... Generator Loss: 0.6889
Epoch 5/5... Discriminator Loss: 1.4148... Generator Loss: 0.6749
Epoch 5/5... Discriminator Loss: 1.3949... Generator Loss: 0.6860
Epoch 5/5... Discriminator Loss: 1.4080... Generator Loss: 0.6888
Epoch 5/5... Discriminator Loss: 1.4153... Generator Loss: 0.6707
Epoch 5/5... Discriminator Loss: 1.3988... Generator Loss: 0.6690
Epoch 5/5... Discriminator Loss: 1.3808... Generator Loss: 0.6831
Epoch 5/5... Discriminator Loss: 1.4110... Generator Loss: 0.6803
Epoch 5/5... Discriminator Loss: 1.4252... Generator Loss: 0.6575
Epoch 5/5... Discriminator Loss: 1.4027... Generator Loss: 0.6531
Epoch 5/5... Discriminator Loss: 1.4016... Generator Loss: 0.6875
Epoch 5/5... Discriminator Loss: 1.4159... Generator Loss: 0.6812
Epoch 5/5... Discriminator Loss: 1.4117... Generator Loss: 0.6666
Epoch 5/5... Discriminator Loss: 1.4035... Generator Loss: 0.6605
Epoch 5/5... Discriminator Loss: 1.3912... Generator Loss: 0.6763
Epoch 5/5... Discriminator Loss: 1.3997... Generator Loss: 0.6770
Epoch 5/5... Discriminator Loss: 1.4017... Generator Loss: 0.7072
Epoch 5/5... Discriminator Loss: 1.3989... Generator Loss: 0.6689
Epoch 5/5... Discriminator Loss: 1.3953... Generator Loss: 0.6965
Epoch 5/5... Discriminator Loss: 1.4189... Generator Loss: 0.6708
Epoch 5/5... Discriminator Loss: 1.4130... Generator Loss: 0.6469
Epoch 5/5... Discriminator Loss: 1.3926... Generator Loss: 0.6893
Epoch 5/5... Discriminator Loss: 1.4057... Generator Loss: 0.6583
Epoch 5/5... Discriminator Loss: 1.4193... Generator Loss: 0.6609
Epoch 5/5... Discriminator Loss: 1.4051... Generator Loss: 0.6801
Epoch 5/5... Discriminator Loss: 1.4010... Generator Loss: 0.6604
Epoch 5/5... Discriminator Loss: 1.3994... Generator Loss: 0.6823
Epoch 5/5... Discriminator Loss: 1.3993... Generator Loss: 0.6652
Epoch 5/5... Discriminator Loss: 1.3847... Generator Loss: 0.7171
Epoch 5/5... Discriminator Loss: 1.3993... Generator Loss: 0.6922
Epoch 5/5... Discriminator Loss: 1.4107... Generator Loss: 0.6840
Epoch 5/5... Discriminator Loss: 1.4228... Generator Loss: 0.6474
Epoch 5/5... Discriminator Loss: 1.4233... Generator Loss: 0.6732
Epoch 5/5... Discriminator Loss: 1.4303... Generator Loss: 0.6299
Epoch 5/5... Discriminator Loss: 1.4093... Generator Loss: 0.6524
Epoch 5/5... Discriminator Loss: 1.4186... Generator Loss: 0.6384
Epoch 5/5... Discriminator Loss: 1.4165... Generator Loss: 0.6485
Epoch 5/5... Discriminator Loss: 1.4019... Generator Loss: 0.6852
Epoch 5/5... Discriminator Loss: 1.3973... Generator Loss: 0.6847
Epoch 5/5... Discriminator Loss: 1.3663... Generator Loss: 0.7189
Epoch 5/5... Discriminator Loss: 1.3942... Generator Loss: 0.6725
Epoch 5/5... Discriminator Loss: 1.4057... Generator Loss: 0.6892
Epoch 5/5... Discriminator Loss: 1.4239... Generator Loss: 0.6802
Epoch 5/5... Discriminator Loss: 1.4077... Generator Loss: 0.6723
Epoch 5/5... Discriminator Loss: 1.4088... Generator Loss: 0.6571
Epoch 5/5... Discriminator Loss: 1.3970... Generator Loss: 0.6869
Epoch 5/5... Discriminator Loss: 1.4033... Generator Loss: 0.6976
Epoch 5/5... Discriminator Loss: 1.4275... Generator Loss: 0.6639
Epoch 5/5... Discriminator Loss: 1.3886... Generator Loss: 0.6563
Epoch 5/5... Discriminator Loss: 1.4041... Generator Loss: 0.6544
Epoch 5/5... Discriminator Loss: 1.4000... Generator Loss: 0.6739
Epoch 5/5... Discriminator Loss: 1.3893... Generator Loss: 0.6556
Epoch 5/5... Discriminator Loss: 1.4081... Generator Loss: 0.6677
Epoch 5/5... Discriminator Loss: 1.3954... Generator Loss: 0.6583
Epoch 5/5... Discriminator Loss: 1.3930... Generator Loss: 0.6878
Epoch 5/5... Discriminator Loss: 1.4005... Generator Loss: 0.6977
Epoch 5/5... Discriminator Loss: 1.3906... Generator Loss: 0.6541
Epoch 5/5... Discriminator Loss: 1.4079... Generator Loss: 0.6513
Epoch 5/5... Discriminator Loss: 1.4200... Generator Loss: 0.6630
Epoch 5/5... Discriminator Loss: 1.3975... Generator Loss: 0.6716
Epoch 5/5... Discriminator Loss: 1.3993... Generator Loss: 0.6565
Epoch 5/5... Discriminator Loss: 1.4131... Generator Loss: 0.7015
Epoch 5/5... Discriminator Loss: 1.4253... Generator Loss: 0.6613
Epoch 5/5... Discriminator Loss: 1.4099... Generator Loss: 0.6710
Epoch 5/5... Discriminator Loss: 1.4241... Generator Loss: 0.6518
Epoch 5/5... Discriminator Loss: 1.4121... Generator Loss: 0.6888
Epoch 5/5... Discriminator Loss: 1.4093... Generator Loss: 0.6426
Epoch 5/5... Discriminator Loss: 1.3906... Generator Loss: 0.6888
Epoch 5/5... Discriminator Loss: 1.4071... Generator Loss: 0.6669
Epoch 5/5... Discriminator Loss: 1.4122... Generator Loss: 0.6702
Epoch 5/5... Discriminator Loss: 1.4272... Generator Loss: 0.6382
Epoch 5/5... Discriminator Loss: 1.3913... Generator Loss: 0.7119
Epoch 5/5... Discriminator Loss: 1.4121... Generator Loss: 0.6734
Epoch 5/5... Discriminator Loss: 1.4011... Generator Loss: 0.6782
Epoch 5/5... Discriminator Loss: 1.4133... Generator Loss: 0.6576
Epoch 5/5... Discriminator Loss: 1.4248... Generator Loss: 0.6353
Epoch 5/5... Discriminator Loss: 1.4004... Generator Loss: 0.6513
Epoch 5/5... Discriminator Loss: 1.4166... Generator Loss: 0.6660
Epoch 5/5... Discriminator Loss: 1.4212... Generator Loss: 0.6874
Epoch 5/5... Discriminator Loss: 1.3982... Generator Loss: 0.6503
Epoch 5/5... Discriminator Loss: 1.3912... Generator Loss: 0.6633
Epoch 5/5... Discriminator Loss: 1.3851... Generator Loss: 0.6687
Epoch 5/5... Discriminator Loss: 1.4113... Generator Loss: 0.6706
Epoch 5/5... Discriminator Loss: 1.4032... Generator Loss: 0.6872
Epoch 5/5... Discriminator Loss: 1.4020... Generator Loss: 0.6753
Epoch 5/5... Discriminator Loss: 1.4098... Generator Loss: 0.6641
Epoch 5/5... Discriminator Loss: 1.4063... Generator Loss: 0.6531
Epoch 5/5... Discriminator Loss: 1.3841... Generator Loss: 0.6683
Epoch 5/5... Discriminator Loss: 1.3960... Generator Loss: 0.6760
Epoch 5/5... Discriminator Loss: 1.4208... Generator Loss: 0.6503
Epoch 5/5... Discriminator Loss: 1.3923... Generator Loss: 0.6818
Epoch 5/5... Discriminator Loss: 1.4049... Generator Loss: 0.6871
Epoch 5/5... Discriminator Loss: 1.3935... Generator Loss: 0.6903
Epoch 5/5... Discriminator Loss: 1.4053... Generator Loss: 0.6518
Epoch 5/5... Discriminator Loss: 1.3971... Generator Loss: 0.6943
Epoch 5/5... Discriminator Loss: 1.4284... Generator Loss: 0.6849
Epoch 5/5... Discriminator Loss: 1.3992... Generator Loss: 0.6889
Epoch 5/5... Discriminator Loss: 1.4166... Generator Loss: 0.6353
Epoch 5/5... Discriminator Loss: 1.4114... Generator Loss: 0.6539
Epoch 5/5... Discriminator Loss: 1.4016... Generator Loss: 0.6475
Epoch 5/5... Discriminator Loss: 1.3998... Generator Loss: 0.6655
Epoch 5/5... Discriminator Loss: 1.4031... Generator Loss: 0.6672
Epoch 5/5... Discriminator Loss: 1.4193... Generator Loss: 0.6635
Epoch 5/5... Discriminator Loss: 1.4044... Generator Loss: 0.6547
Epoch 5/5... Discriminator Loss: 1.4084... Generator Loss: 0.6854
Epoch 5/5... Discriminator Loss: 1.3864... Generator Loss: 0.6954
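
A quick sanity check on these numbers (a back-of-the-envelope note, not part of the notebook output): with the standard sigmoid cross-entropy GAN losses, a discriminator that can no longer tell real images from generated ones outputs D ≈ 0.5 for both, which gives the equilibrium values

    d_loss = -[log D(x) + log(1 - D(G(z)))] = -2 log(1/2) = 2 log 2 ≈ 1.386
    g_loss = -log D(G(z)) = -log(1/2) = log 2 ≈ 0.693

The logged losses hovering around 1.40 and 0.67 are close to these values, which suggests the two networks stayed in rough balance rather than one of them collapsing.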

Submitting This Project

When submitting this project, make sure to run all the cells before saving the notebook. Save the notebook file as "dlnd_face_generation.ipynb" and also export it as an HTML file under "File" -> "Download as". Include the "helper.py" and "problem_unittests.py" files in your submission.
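
If you prefer exporting from the command line instead of the notebook menu, a minimal sketch using nbconvert is shown below. This is an optional alternative, not part of the project requirements; it assumes jupyter is on your PATH and that the notebook sits in the current working directory.

import subprocess

# Export the saved notebook to HTML. nbconvert writes
# "dlnd_face_generation.html" next to the notebook file.
subprocess.run(
    ["jupyter", "nbconvert", "--to", "html", "dlnd_face_generation.ipynb"],
    check=True,  # raise an error if the export fails
)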