Deep Convolutional GANs

In this notebook, you'll build a GAN using convolutional layers in the generator and discriminator. This is called a Deep Convolutional GAN, or DCGAN for short. The DCGAN architecture was first explored in 2015 and has seen impressive results in generating new images; you can read the original paper, "Unsupervised Representation Learning with Deep Convolutional Generative Adversarial Networks" by Radford et al.

You'll be training DCGAN on the Street View House Numbers (SVHN) dataset. These are color images of house numbers collected from Google Street View, and they're much more variable than MNIST.

So, we'll need a deeper and more powerful network. This is accomplished by using convolutional layers in the discriminator and generator. It's also necessary to use batch normalization to get the convolutional networks to train. The only real changes from what you saw previously are in the generator and discriminator; the rest of the implementation is the same.


In [1]:
%matplotlib inline

import pickle as pkl

import matplotlib.pyplot as plt
import numpy as np
from scipy.io import loadmat
import tensorflow as tf

In [2]:
!mkdir data


mkdir: cannot create directory ‘data’: File exists

Getting the data

Here you can download the SVHN dataset. Run the cell below and it'll download to your machine.


In [3]:
from urllib.request import urlretrieve
from os.path import isfile, isdir
from tqdm import tqdm

data_dir = 'data/'

if not isdir(data_dir):
    raise Exception("Data directory doesn't exist!")

class DLProgress(tqdm):
    last_block = 0

    def hook(self, block_num=1, block_size=1, total_size=None):
        self.total = total_size
        self.update((block_num - self.last_block) * block_size)
        self.last_block = block_num

if not isfile(data_dir + "train_32x32.mat"):
    with DLProgress(unit='B', unit_scale=True, miniters=1, desc='SVHN Training Set') as pbar:
        urlretrieve(
            'http://ufldl.stanford.edu/housenumbers/train_32x32.mat',
            data_dir + 'train_32x32.mat',
            pbar.hook)

if not isfile(data_dir + "test_32x32.mat"):
    with DLProgress(unit='B', unit_scale=True, miniters=1, desc='SVHN Testing Set') as pbar:
        urlretrieve(
            'http://ufldl.stanford.edu/housenumbers/test_32x32.mat',
            data_dir + 'test_32x32.mat',
            pbar.hook)

These SVHN files are .mat files, typically used with MATLAB. However, we can load them in with scipy.io.loadmat, which we imported above.


In [4]:
trainset = loadmat(data_dir + 'train_32x32.mat')
testset = loadmat(data_dir + 'test_32x32.mat')

Here I'm showing a small sample of the images. Each of these is 32x32 with 3 color channels (RGB). These are the real images we'll pass to the discriminator and what the generator will eventually fake.


In [5]:
idx = np.random.randint(0, trainset['X'].shape[3], size=36)
fig, axes = plt.subplots(6, 6, sharex=True, sharey=True, figsize=(5,5),)
for ii, ax in zip(idx, axes.flatten()):
    ax.imshow(trainset['X'][:,:,:,ii], aspect='equal')
    ax.xaxis.set_visible(False)
    ax.yaxis.set_visible(False)
plt.subplots_adjust(wspace=0, hspace=0)


Here we need to do a bit of preprocessing to get the images into a form where we can pass batches to the network. First off, we need to rescale the images to a range of -1 to 1, since the output of our generator is also in that range. We also have a set of test and validation images, which could be used if we were trying to identify the numbers in the images.


In [6]:
def scale(x, feature_range=(-1, 1)):
    # scale to (0, 1), assuming 8-bit images with a maximum pixel value of 255
    x = ((x - x.min())/(255 - x.min()))
    
    # scale to feature_range (renamed to avoid shadowing the min/max builtins)
    fmin, fmax = feature_range
    x = x * (fmax - fmin) + fmin
    return x
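As a quick sanity check, here's a self-contained numpy sketch (with a copy of the scale function, the min/max variables renamed to avoid shadowing builtins) confirming that 8-bit pixel values land in the -1 to 1 feature range:

```python
import numpy as np

# Self-contained copy of scale, checking that 8-bit pixel
# values land in the (-1, 1) feature range.
def scale(x, feature_range=(-1, 1)):
    x = ((x - x.min()) / (255 - x.min()))  # scale to (0, 1), assuming max pixel value 255
    fmin, fmax = feature_range
    x = x * (fmax - fmin) + fmin           # scale to feature_range
    return x

img = np.array([[0, 127, 255]], dtype=np.float32)
scaled = scale(img)
print(scaled.min(), scaled.max())  # -1.0 1.0
```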

In [7]:
class Dataset:
    def __init__(self, train, test, val_frac=0.5, shuffle=False, scale_func=None):
        split_idx = int(len(test['y'])*(1 - val_frac))
        self.test_x, self.valid_x = test['X'][:,:,:,:split_idx], test['X'][:,:,:,split_idx:]
        self.test_y, self.valid_y = test['y'][:split_idx], test['y'][split_idx:]
        self.train_x, self.train_y = train['X'], train['y']
        
        self.train_x = np.rollaxis(self.train_x, 3)
        self.valid_x = np.rollaxis(self.valid_x, 3)
        self.test_x = np.rollaxis(self.test_x, 3)
        
        if scale_func is None:
            self.scaler = scale
        else:
            self.scaler = scale_func
        self.shuffle = shuffle
        
    def batches(self, batch_size):
        if self.shuffle:
            idx = np.arange(len(self.train_x))
            np.random.shuffle(idx)
            self.train_x = self.train_x[idx]
            self.train_y = self.train_y[idx]
        
        n_batches = len(self.train_y)//batch_size
        for ii in range(0, n_batches*batch_size, batch_size):
            x = self.train_x[ii:ii+batch_size]
            y = self.train_y[ii:ii+batch_size]
            
            yield self.scaler(x), y
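One detail worth noting: loadmat returns SVHN images with the sample axis last, which is why the constructor calls np.rollaxis. A minimal numpy check of what that does:

```python
import numpy as np

# loadmat returns SVHN images as (height, width, channels, N);
# np.rollaxis(x, 3) moves the sample axis to the front, giving the
# (N, height, width, channels) layout the network expects.
fake_svhn = np.zeros((32, 32, 3, 10))  # 10 dummy images in SVHN's layout
rolled = np.rollaxis(fake_svhn, 3)
print(rolled.shape)  # (10, 32, 32, 3)
```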

Network Inputs

Here we're just creating some placeholders, as usual.


In [8]:
def model_inputs(real_dim, z_dim):
    inputs_real = tf.placeholder(tf.float32, (None, *real_dim), name='input_real')
    inputs_z = tf.placeholder(tf.float32, (None, z_dim), name='input_z')
    
    return inputs_real, inputs_z

Generator

Here you'll build the generator network. The input will be our noise vector z as before. Also as before, the output will be a $\tanh$ output, but this time with size 32x32, which is the size of our SVHN images.

What's new here is that we'll use convolutional layers to create our new images. The first layer is a fully connected layer which is reshaped into a deep and narrow layer, something like 4x4x1024 as in the original DCGAN paper. Then we use batch normalization and a leaky ReLU activation. Next is a transposed convolution where typically you'd halve the depth and double the width and height of the previous layer. Again, we use batch normalization and leaky ReLU. For each of these layers, the general scheme is transposed convolution > batch norm > leaky ReLU.

You keep stacking layers up like this until you get the final transposed convolution layer with shape 32x32x3. Below is the architecture used in the original DCGAN paper:

Note that the final layer here is 64x64x3, while for our SVHN dataset, we only want it to be 32x32x3.

Exercise: Build the transposed convolutional network for the generator in the function below. Be sure to use leaky ReLUs on all the layers except for the last tanh layer, as well as batch normalization on all the transposed convolutional layers except the last one.


In [20]:
def generator(z, output_dim, reuse=False, alpha=0.2, training=True):
    with tf.variable_scope('generator', reuse=reuse):
        # First fully connected layer
        x1 = tf.layers.dense(z, 4*4*512)
        # Reshape it to start the convolutional stack
        x1 = tf.reshape(x1, (-1, 4, 4, 512))
        x1 = tf.layers.batch_normalization(x1, training=training)
        x1 = tf.maximum(alpha * x1, x1)
        # 4x4x512 now
        
        x2 = tf.layers.conv2d_transpose(x1, 256, 5, strides=2, padding='same')
        x2 = tf.layers.batch_normalization(x2, training=training)
        x2 = tf.maximum(alpha * x2, x2)
        # 8x8x256 now
        
        x3 = tf.layers.conv2d_transpose(x2, 128, 5, strides=2, padding='same')
        x3 = tf.layers.batch_normalization(x3, training=training)
        x3 = tf.maximum(alpha * x3, x3)
        # 16x16x128 now
        
        # Output layer
        logits = tf.layers.conv2d_transpose(x3, output_dim, 5, strides=2, padding='same')
        # 32x32x3 now
        
        out = tf.tanh(logits)
        
        return out
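Since every transposed convolution here uses padding='same' with strides=2, each layer doubles the spatial size, so the shapes in the comments can be verified with simple arithmetic (a plain-Python sketch, independent of TensorFlow):

```python
# With padding='same', conv2d_transpose produces an output of size
# input_size * stride, so each stride-2 layer doubles height and width.
size = 4                      # spatial size after the dense layer is reshaped
for depth in (256, 128, 3):   # filter depths of the three transposed convs
    size *= 2
    print(size, depth)        # 8 256, then 16 128, then 32 3
assert size == 32             # matches the 32x32x3 SVHN images
```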

Discriminator

Here you'll build the discriminator. This is basically just a convolutional classifier like you've built before. The inputs to the discriminator are 32x32x3 tensors/images. You'll want a few convolutional layers, then a fully connected layer for the output. As before, we want a sigmoid output, and you'll need to return the logits as well. For the depths of the convolutional layers, I suggest starting with something like 32 or 64 filters in the first layer, then doubling the depth as you add layers. Note that in the DCGAN paper, they did all the downsampling using only strided convolutional layers, with no maxpool layers.

You'll also want to use batch normalization with tf.layers.batch_normalization on each layer except the first convolutional and output layers. Again, each layer should look something like convolution > batch norm > leaky ReLU.

Note: in this project, your batch normalization layers will always use batch statistics. (That is, always set training to True.) That's because we are only interested in using the discriminator to help train the generator. However, if you wanted to use the discriminator for inference later, then you would need to set the training parameter appropriately.

Exercise: Build the convolutional network for the discriminator. The input is a 32x32x3 image, and the output is a sigmoid output plus the logits. Again, use leaky ReLU activations and batch normalization on all the layers except the first.


In [28]:
def discriminator(x, reuse=False, alpha=0.2):
    with tf.variable_scope('discriminator', reuse=reuse):
        # Input layer is 32x32x3
        x1 = tf.layers.conv2d(inputs=x, filters=64, kernel_size=5, strides=2, padding="same")
        x1 = tf.maximum(alpha * x1, x1)
        # 16x16x64 now
        
        x2 = tf.layers.conv2d(inputs=x1, filters=128, kernel_size=5, strides=2, padding="same")
        x2 = tf.layers.batch_normalization(x2, training=True)
        x2 = tf.maximum(alpha * x2, x2)
        # 8x8x128 now
        
        x3 = tf.layers.conv2d(inputs=x2, filters=256, kernel_size=5, strides=2, padding="same")
        x3 = tf.layers.batch_normalization(x3, training=True)
        x3 = tf.maximum(alpha * x3, x3)
        # 4x4x256 now
        
        flat = tf.reshape(x3, (-1, 4*4*256))
        logits = tf.layers.dense(flat, 1)
        out = tf.nn.sigmoid(logits)
        
        return out, logits
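The mirror-image arithmetic holds for the discriminator: with padding='same' and stride 2, each conv layer outputs ceil(input_size / 2), which is where the 4*4*256 in the reshape comes from (again a plain-Python sketch):

```python
import math

# With padding='same' and stride 2, tf.layers.conv2d outputs
# ceil(input_size / stride) per spatial dimension: 32 -> 16 -> 8 -> 4.
size = 32
for depth in (64, 128, 256):  # filter depths of the three conv layers
    size = math.ceil(size / 2)
    print(size, depth)        # 16 64, then 8 128, then 4 256
assert size * size * 256 == 4*4*256 == 4096  # the flattened vector fed to the dense layer
```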

Model Loss

Calculating the loss like before, nothing new here.


In [29]:
def model_loss(input_real, input_z, output_dim, alpha=0.2):
    """
    Get the loss for the discriminator and generator
    :param input_real: Images from the real dataset
    :param input_z: Z input
    :param output_dim: The number of channels in the output image
    :param alpha: Leak parameter for the leaky ReLU activations
    :return: A tuple of (discriminator loss, generator loss)
    """
    g_model = generator(input_z, output_dim, alpha=alpha)
    d_model_real, d_logits_real = discriminator(input_real, alpha=alpha)
    d_model_fake, d_logits_fake = discriminator(g_model, reuse=True, alpha=alpha)

    d_loss_real = tf.reduce_mean(
        tf.nn.sigmoid_cross_entropy_with_logits(logits=d_logits_real, labels=tf.ones_like(d_model_real)))
    d_loss_fake = tf.reduce_mean(
        tf.nn.sigmoid_cross_entropy_with_logits(logits=d_logits_fake, labels=tf.zeros_like(d_model_fake)))
    g_loss = tf.reduce_mean(
        tf.nn.sigmoid_cross_entropy_with_logits(logits=d_logits_fake, labels=tf.ones_like(d_model_fake)))

    d_loss = d_loss_real + d_loss_fake

    return d_loss, g_loss
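For reference, tf.nn.sigmoid_cross_entropy_with_logits computes $-z \log \sigma(x) - (1-z) \log(1-\sigma(x))$ for logits $x$ and labels $z$, using the numerically stable form given in TensorFlow's documentation. A small numpy sketch showing the two forms agree:

```python
import numpy as np

# tf.nn.sigmoid_cross_entropy_with_logits uses the numerically stable form
#   max(x, 0) - x*z + log(1 + exp(-|x|))
# which equals -z*log(sigmoid(x)) - (1-z)*log(1-sigmoid(x)).
def sigmoid_cross_entropy_with_logits(labels, logits):
    x, z = logits, labels
    return np.maximum(x, 0) - x * z + np.log1p(np.exp(-np.abs(x)))

logits = np.array([2.0, -1.0, 0.0])
labels = np.ones_like(logits)               # "real" labels, as in d_loss_real
naive = -np.log(1 / (1 + np.exp(-logits)))  # -log(sigmoid(x)) when z = 1
print(np.allclose(sigmoid_cross_entropy_with_logits(labels, logits), naive))  # True
```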

Optimizers

Not much new here, but notice how the train operations are wrapped in a with tf.control_dependencies block so the batch normalization layers can update their population statistics.


In [30]:
def model_opt(d_loss, g_loss, learning_rate, beta1):
    """
    Get optimization operations
    :param d_loss: Discriminator loss Tensor
    :param g_loss: Generator loss Tensor
    :param learning_rate: Learning Rate Placeholder
    :param beta1: The exponential decay rate for the 1st moment in the optimizer
    :return: A tuple of (discriminator training operation, generator training operation)
    """
    # Get weights and bias to update
    t_vars = tf.trainable_variables()
    d_vars = [var for var in t_vars if var.name.startswith('discriminator')]
    g_vars = [var for var in t_vars if var.name.startswith('generator')]

    # Optimize
    with tf.control_dependencies(tf.get_collection(tf.GraphKeys.UPDATE_OPS)):
        d_train_opt = tf.train.AdamOptimizer(learning_rate, beta1=beta1).minimize(d_loss, var_list=d_vars)
        g_train_opt = tf.train.AdamOptimizer(learning_rate, beta1=beta1).minimize(g_loss, var_list=g_vars)

    return d_train_opt, g_train_opt

Building the model

Here we can use the functions we defined above to build the model as a class. This will make it easier to move the network around in our code, since the nodes and operations in the graph are packaged in one object.


In [31]:
class GAN:
    def __init__(self, real_size, z_size, learning_rate, alpha=0.2, beta1=0.5):
        tf.reset_default_graph()
        
        self.input_real, self.input_z = model_inputs(real_size, z_size)
        
        self.d_loss, self.g_loss = model_loss(self.input_real, self.input_z,
                                              real_size[2], alpha=alpha)
        
        self.d_opt, self.g_opt = model_opt(self.d_loss, self.g_loss, learning_rate, beta1)

Here is a function for displaying generated images.


In [32]:
def view_samples(epoch, samples, nrows, ncols, figsize=(5,5)):
    fig, axes = plt.subplots(figsize=figsize, nrows=nrows, ncols=ncols, 
                             sharey=True, sharex=True)
    for ax, img in zip(axes.flatten(), samples[epoch]):
        ax.axis('off')
        img = ((img - img.min())*255 / (img.max() - img.min())).astype(np.uint8)
        ax.set_adjustable('box-forced')
        im = ax.imshow(img, aspect='equal')
   
    plt.subplots_adjust(wspace=0, hspace=0)
    return fig, axes

And another function we can use to train our network. Notice when we call generator to create the samples to display, we set training to False. That's so the batch normalization layers will use the population statistics rather than the batch statistics. Also notice that we set the net.input_real placeholder when we run the generator's optimizer. The generator doesn't actually use it, but we'd get an error without it because of the tf.control_dependencies block we created in model_opt.


In [33]:
def train(net, dataset, epochs, batch_size, print_every=20, show_every=400, figsize=(5,5)):
    saver = tf.train.Saver()
    sample_z = np.random.uniform(-1, 1, size=(72, z_size))

    samples, losses = [], []
    steps = 0

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for e in range(epochs):
            for x, y in dataset.batches(batch_size):
                steps += 1

                # Sample random noise for G
                batch_z = np.random.uniform(-1, 1, size=(batch_size, z_size))

                # Run optimizers
                _ = sess.run(net.d_opt, feed_dict={net.input_real: x, net.input_z: batch_z})
                _ = sess.run(net.g_opt, feed_dict={net.input_z: batch_z, net.input_real: x})

                if steps % print_every == 0:
                    # Every print_every steps, get the losses and print them out
                    train_loss_d = net.d_loss.eval({net.input_z: batch_z, net.input_real: x})
                    train_loss_g = net.g_loss.eval({net.input_z: batch_z})

                    print("Epoch {}/{}...".format(e+1, epochs),
                          "Discriminator Loss: {:.4f}...".format(train_loss_d),
                          "Generator Loss: {:.4f}".format(train_loss_g))
                    # Save losses to view after training
                    losses.append((train_loss_d, train_loss_g))

                if steps % show_every == 0:
                    with tf.device("/gpu:1"):
                        gen_samples = sess.run(
                                       generator(net.input_z, 3, reuse=True, training=False),
                                       feed_dict={net.input_z: sample_z})
                        samples.append(gen_samples)
                        _ = view_samples(-1, samples, 6, 12, figsize=figsize)
                        plt.show()

        saver.save(sess, './checkpoints/generator.ckpt')

    with open('samples.pkl', 'wb') as f:
        pkl.dump(samples, f)
    
    return losses, samples

Hyperparameters

GANs are very sensitive to hyperparameters. A lot of experimentation goes into finding the best hyperparameters such that the generator and discriminator don't overpower each other. Try out your own hyperparameters or read the DCGAN paper to see what worked for them.

Exercise: Find hyperparameters to train this GAN. The values found in the DCGAN paper work well, or you can experiment on your own. In general, you want the discriminator to stay uncertain: when it classifies images as real or fake no better than chance, each loss term is about $\ln 2 \approx 0.69$, so the summed discriminator loss hovers around 1.4.
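As a reference point for interpreting the discriminator losses printed during training, a quick arithmetic check:

```python
import math

# A discriminator that outputs 0.5 for every image (pure guessing) has a
# cross-entropy of ln(2) per term, so d_loss = d_loss_real + d_loss_fake
# sums to 2*ln(2) at that point of maximum uncertainty.
per_term = -math.log(0.5)
print(round(per_term, 4), round(2 * per_term, 4))  # 0.6931 1.3863
```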


In [36]:
real_size = (32,32,3)
z_size = 100
learning_rate = 0.0002
batch_size = 128
epochs = 25
alpha = 0.2
beta1 = 0.5

# Create the network
net = GAN(real_size, z_size, learning_rate, alpha=alpha, beta1=beta1)

In [37]:
# Load the data and train the network here
dataset = Dataset(trainset, testset)
losses, samples = train(net, dataset, epochs, batch_size, figsize=(10,5))


Epoch 1/25... Discriminator Loss: 0.3763... Generator Loss: 2.1946
Epoch 1/25... Discriminator Loss: 0.1493... Generator Loss: 2.5731
Epoch 1/25... Discriminator Loss: 0.0703... Generator Loss: 3.4994
Epoch 1/25... Discriminator Loss: 0.0941... Generator Loss: 3.4324
Epoch 1/25... Discriminator Loss: 0.2097... Generator Loss: 2.0871
Epoch 1/25... Discriminator Loss: 0.1855... Generator Loss: 2.4278
Epoch 1/25... Discriminator Loss: 0.5734... Generator Loss: 2.0745
Epoch 1/25... Discriminator Loss: 0.4183... Generator Loss: 1.9456
Epoch 1/25... Discriminator Loss: 0.2267... Generator Loss: 2.8273
Epoch 1/25... Discriminator Loss: 0.1906... Generator Loss: 2.4483
Epoch 1/25... Discriminator Loss: 0.2874... Generator Loss: 4.2017
Epoch 1/25... Discriminator Loss: 0.6901... Generator Loss: 1.0990
Epoch 1/25... Discriminator Loss: 0.4127... Generator Loss: 2.5948
Epoch 1/25... Discriminator Loss: 0.9052... Generator Loss: 3.2087
Epoch 1/25... Discriminator Loss: 1.1984... Generator Loss: 1.8652
Epoch 1/25... Discriminator Loss: 0.6553... Generator Loss: 1.2667
Epoch 1/25... Discriminator Loss: 0.8665... Generator Loss: 1.3686
Epoch 1/25... Discriminator Loss: 0.5753... Generator Loss: 1.6822
Epoch 1/25... Discriminator Loss: 0.4748... Generator Loss: 1.8928
Epoch 1/25... Discriminator Loss: 0.4017... Generator Loss: 1.7120
Epoch 1/25... Discriminator Loss: 1.0253... Generator Loss: 0.6868
Epoch 1/25... Discriminator Loss: 0.6111... Generator Loss: 1.9373
Epoch 1/25... Discriminator Loss: 1.0346... Generator Loss: 0.5879
Epoch 1/25... Discriminator Loss: 0.9990... Generator Loss: 1.0957
Epoch 1/25... Discriminator Loss: 0.8632... Generator Loss: 1.5626
Epoch 1/25... Discriminator Loss: 0.8453... Generator Loss: 1.2344
Epoch 1/25... Discriminator Loss: 1.3468... Generator Loss: 0.4878
Epoch 1/25... Discriminator Loss: 0.7187... Generator Loss: 1.3893
Epoch 2/25... Discriminator Loss: 0.6915... Generator Loss: 1.5955
Epoch 2/25... Discriminator Loss: 1.1388... Generator Loss: 1.1971
Epoch 2/25... Discriminator Loss: 0.8360... Generator Loss: 1.1958
Epoch 2/25... Discriminator Loss: 0.7549... Generator Loss: 1.3521
Epoch 2/25... Discriminator Loss: 1.3504... Generator Loss: 0.9805
Epoch 2/25... Discriminator Loss: 1.0844... Generator Loss: 1.2674
Epoch 2/25... Discriminator Loss: 0.8386... Generator Loss: 1.3043
Epoch 2/25... Discriminator Loss: 0.7878... Generator Loss: 1.2044
Epoch 2/25... Discriminator Loss: 0.7611... Generator Loss: 1.1316
Epoch 2/25... Discriminator Loss: 1.1325... Generator Loss: 0.9366
Epoch 2/25... Discriminator Loss: 1.1143... Generator Loss: 0.9220
Epoch 2/25... Discriminator Loss: 1.3520... Generator Loss: 0.7191
Epoch 2/25... Discriminator Loss: 1.3431... Generator Loss: 1.0047
Epoch 2/25... Discriminator Loss: 1.3971... Generator Loss: 0.7349
Epoch 2/25... Discriminator Loss: 1.2230... Generator Loss: 1.0269
Epoch 2/25... Discriminator Loss: 1.1939... Generator Loss: 0.7871
Epoch 2/25... Discriminator Loss: 1.2151... Generator Loss: 0.9270
Epoch 2/25... Discriminator Loss: 1.0543... Generator Loss: 1.2431
Epoch 2/25... Discriminator Loss: 1.0395... Generator Loss: 1.2405
Epoch 2/25... Discriminator Loss: 1.3537... Generator Loss: 0.6601
Epoch 2/25... Discriminator Loss: 1.1295... Generator Loss: 1.1513
Epoch 2/25... Discriminator Loss: 1.2408... Generator Loss: 0.9605
Epoch 2/25... Discriminator Loss: 1.2062... Generator Loss: 1.4125
Epoch 2/25... Discriminator Loss: 1.0998... Generator Loss: 0.8977
Epoch 2/25... Discriminator Loss: 1.3617... Generator Loss: 0.6124
Epoch 2/25... Discriminator Loss: 0.9855... Generator Loss: 1.2857
Epoch 2/25... Discriminator Loss: 1.2520... Generator Loss: 0.9669
Epoch 2/25... Discriminator Loss: 1.1918... Generator Loss: 0.9032
Epoch 2/25... Discriminator Loss: 1.1415... Generator Loss: 0.9382
Epoch 3/25... Discriminator Loss: 1.0416... Generator Loss: 0.8800
Epoch 3/25... Discriminator Loss: 0.8978... Generator Loss: 0.9424
Epoch 3/25... Discriminator Loss: 0.7848... Generator Loss: 1.3207
Epoch 3/25... Discriminator Loss: 1.0792... Generator Loss: 0.7130
Epoch 3/25... Discriminator Loss: 0.8855... Generator Loss: 1.1155
Epoch 3/25... Discriminator Loss: 0.6971... Generator Loss: 1.1584
Epoch 3/25... Discriminator Loss: 0.7565... Generator Loss: 1.4211
Epoch 3/25... Discriminator Loss: 0.8081... Generator Loss: 1.3896
Epoch 3/25... Discriminator Loss: 0.5764... Generator Loss: 1.8060
Epoch 3/25... Discriminator Loss: 0.8732... Generator Loss: 1.5823
Epoch 3/25... Discriminator Loss: 0.3234... Generator Loss: 2.5127
Epoch 3/25... Discriminator Loss: 0.8600... Generator Loss: 0.8580
Epoch 3/25... Discriminator Loss: 0.7186... Generator Loss: 1.0450
Epoch 3/25... Discriminator Loss: 0.7744... Generator Loss: 0.8696
Epoch 3/25... Discriminator Loss: 0.7918... Generator Loss: 0.8998
Epoch 3/25... Discriminator Loss: 0.6272... Generator Loss: 1.2519
Epoch 3/25... Discriminator Loss: 0.6162... Generator Loss: 1.2619
Epoch 3/25... Discriminator Loss: 0.7115... Generator Loss: 1.8091
Epoch 3/25... Discriminator Loss: 0.5512... Generator Loss: 1.5475
Epoch 3/25... Discriminator Loss: 0.9550... Generator Loss: 0.6410
Epoch 3/25... Discriminator Loss: 0.7029... Generator Loss: 1.0913
Epoch 3/25... Discriminator Loss: 0.4497... Generator Loss: 2.1033
Epoch 3/25... Discriminator Loss: 0.6946... Generator Loss: 1.7284
Epoch 3/25... Discriminator Loss: 0.5613... Generator Loss: 1.6113
Epoch 3/25... Discriminator Loss: 0.5386... Generator Loss: 1.7025
Epoch 3/25... Discriminator Loss: 0.4643... Generator Loss: 1.9775
Epoch 3/25... Discriminator Loss: 0.6651... Generator Loss: 1.2941
Epoch 3/25... Discriminator Loss: 0.5141... Generator Loss: 1.7972
Epoch 4/25... Discriminator Loss: 0.8769... Generator Loss: 0.9786
Epoch 4/25... Discriminator Loss: 0.5409... Generator Loss: 1.5060
Epoch 4/25... Discriminator Loss: 0.4333... Generator Loss: 2.1853
Epoch 4/25... Discriminator Loss: 0.6244... Generator Loss: 1.5448
Epoch 4/25... Discriminator Loss: 0.6517... Generator Loss: 1.2361
Epoch 4/25... Discriminator Loss: 3.1654... Generator Loss: 4.9037
Epoch 4/25... Discriminator Loss: 0.7562... Generator Loss: 1.0370
Epoch 4/25... Discriminator Loss: 0.8647... Generator Loss: 0.7743
Epoch 4/25... Discriminator Loss: 0.7920... Generator Loss: 0.9340
Epoch 4/25... Discriminator Loss: 0.6116... Generator Loss: 1.8041
Epoch 4/25... Discriminator Loss: 0.5010... Generator Loss: 2.4769
Epoch 4/25... Discriminator Loss: 0.5443... Generator Loss: 1.2909
Epoch 4/25... Discriminator Loss: 0.5613... Generator Loss: 1.8125
Epoch 4/25... Discriminator Loss: 0.5816... Generator Loss: 1.4420
Epoch 4/25... Discriminator Loss: 0.5820... Generator Loss: 1.1774
Epoch 4/25... Discriminator Loss: 0.6550... Generator Loss: 1.8007
Epoch 4/25... Discriminator Loss: 0.6581... Generator Loss: 1.1810
Epoch 4/25... Discriminator Loss: 0.5919... Generator Loss: 1.1974
Epoch 4/25... Discriminator Loss: 0.7895... Generator Loss: 1.0313
Epoch 4/25... Discriminator Loss: 0.5067... Generator Loss: 1.3897
Epoch 4/25... Discriminator Loss: 0.5763... Generator Loss: 1.4764
Epoch 4/25... Discriminator Loss: 0.3649... Generator Loss: 1.8123
Epoch 4/25... Discriminator Loss: 0.7722... Generator Loss: 1.6712
Epoch 4/25... Discriminator Loss: 0.7885... Generator Loss: 1.2393
Epoch 4/25... Discriminator Loss: 0.7140... Generator Loss: 1.2874
Epoch 4/25... Discriminator Loss: 0.5384... Generator Loss: 1.6251
Epoch 4/25... Discriminator Loss: 0.6059... Generator Loss: 1.6147
Epoch 4/25... Discriminator Loss: 0.7389... Generator Loss: 0.9116
Epoch 4/25... Discriminator Loss: 0.8916... Generator Loss: 0.7233
Epoch 5/25... Discriminator Loss: 0.4169... Generator Loss: 1.6586
Epoch 5/25... Discriminator Loss: 1.3136... Generator Loss: 2.8866
Epoch 5/25... Discriminator Loss: 0.5447... Generator Loss: 1.2017
Epoch 5/25... Discriminator Loss: 1.0614... Generator Loss: 0.5851
Epoch 5/25... Discriminator Loss: 0.6373... Generator Loss: 1.0940
Epoch 5/25... Discriminator Loss: 0.8699... Generator Loss: 0.7953
Epoch 5/25... Discriminator Loss: 0.3600... Generator Loss: 2.0787
Epoch 5/25... Discriminator Loss: 1.1986... Generator Loss: 0.5301
Epoch 5/25... Discriminator Loss: 0.7024... Generator Loss: 1.0281
Epoch 5/25... Discriminator Loss: 0.6293... Generator Loss: 1.2785
Epoch 5/25... Discriminator Loss: 0.4848... Generator Loss: 1.4909
Epoch 5/25... Discriminator Loss: 0.7903... Generator Loss: 1.0359
Epoch 5/25... Discriminator Loss: 0.6414... Generator Loss: 1.2455
Epoch 5/25... Discriminator Loss: 0.8331... Generator Loss: 0.8229
Epoch 5/25... Discriminator Loss: 0.5263... Generator Loss: 1.8684
Epoch 5/25... Discriminator Loss: 0.4544... Generator Loss: 1.4874
Epoch 5/25... Discriminator Loss: 0.8301... Generator Loss: 2.5896
Epoch 5/25... Discriminator Loss: 0.4583... Generator Loss: 1.8497
Epoch 5/25... Discriminator Loss: 0.5579... Generator Loss: 1.3185
Epoch 5/25... Discriminator Loss: 0.8301... Generator Loss: 1.0814
Epoch 5/25... Discriminator Loss: 0.6533... Generator Loss: 1.1027
Epoch 5/25... Discriminator Loss: 0.6285... Generator Loss: 1.0265
Epoch 5/25... Discriminator Loss: 0.6266... Generator Loss: 1.2254
Epoch 5/25... Discriminator Loss: 0.7607... Generator Loss: 1.6768
Epoch 5/25... Discriminator Loss: 0.7564... Generator Loss: 2.1685
Epoch 5/25... Discriminator Loss: 0.6708... Generator Loss: 1.4163
Epoch 5/25... Discriminator Loss: 0.5591... Generator Loss: 1.5070
Epoch 5/25... Discriminator Loss: 0.6011... Generator Loss: 1.7665
Epoch 5/25... Discriminator Loss: 0.7447... Generator Loss: 0.9114
Epoch 6/25... Discriminator Loss: 0.6274... Generator Loss: 1.1631
Epoch 6/25... Discriminator Loss: 0.9937... Generator Loss: 0.7415
Epoch 6/25... Discriminator Loss: 0.5981... Generator Loss: 1.8021
Epoch 6/25... Discriminator Loss: 0.8334... Generator Loss: 0.8382
Epoch 6/25... Discriminator Loss: 0.9607... Generator Loss: 0.6777
Epoch 6/25... Discriminator Loss: 0.7676... Generator Loss: 0.9210
Epoch 6/25... Discriminator Loss: 0.5349... Generator Loss: 1.4683
Epoch 6/25... Discriminator Loss: 1.5912... Generator Loss: 0.8660
Epoch 6/25... Discriminator Loss: 0.4104... Generator Loss: 1.5049
Epoch 6/25... Discriminator Loss: 0.6511... Generator Loss: 1.1267
Epoch 6/25... Discriminator Loss: 0.7038... Generator Loss: 1.4327
Epoch 6/25... Discriminator Loss: 0.6471... Generator Loss: 1.1139
Epoch 6/25... Discriminator Loss: 0.8031... Generator Loss: 0.8686
Epoch 6/25... Discriminator Loss: 0.7465... Generator Loss: 0.9143
Epoch 6/25... Discriminator Loss: 1.7165... Generator Loss: 0.2897
Epoch 6/25... Discriminator Loss: 0.4268... Generator Loss: 1.5346
Epoch 6/25... Discriminator Loss: 0.5826... Generator Loss: 1.7936
Epoch 6/25... Discriminator Loss: 0.6196... Generator Loss: 1.5895
Epoch 6/25... Discriminator Loss: 0.7711... Generator Loss: 0.9135
Epoch 6/25... Discriminator Loss: 1.1265... Generator Loss: 0.5746
Epoch 6/25... Discriminator Loss: 0.6297... Generator Loss: 1.1233
Epoch 6/25... Discriminator Loss: 0.4892... Generator Loss: 1.3249
Epoch 6/25... Discriminator Loss: 0.9466... Generator Loss: 0.7000
Epoch 6/25... Discriminator Loss: 0.4945... Generator Loss: 1.6744
Epoch 6/25... Discriminator Loss: 0.4396... Generator Loss: 1.5239
Epoch 6/25... Discriminator Loss: 2.0018... Generator Loss: 0.2053
Epoch 6/25... Discriminator Loss: 1.8157... Generator Loss: 0.2443
Epoch 6/25... Discriminator Loss: 0.5534... Generator Loss: 1.5607
Epoch 7/25... Discriminator Loss: 0.6537... Generator Loss: 1.4399
Epoch 7/25... Discriminator Loss: 0.8067... Generator Loss: 0.9986
Epoch 7/25... Discriminator Loss: 0.4892... Generator Loss: 1.6633
Epoch 7/25... Discriminator Loss: 1.1448... Generator Loss: 0.5434
Epoch 7/25... Discriminator Loss: 0.8669... Generator Loss: 0.8176
Epoch 7/25... Discriminator Loss: 0.8089... Generator Loss: 1.6584
Epoch 7/25... Discriminator Loss: 0.4787... Generator Loss: 1.4161
Epoch 7/25... Discriminator Loss: 1.5434... Generator Loss: 3.1058
Epoch 7/25... Discriminator Loss: 0.7255... Generator Loss: 0.9906
Epoch 7/25... Discriminator Loss: 0.6351... Generator Loss: 1.2095
Epoch 7/25... Discriminator Loss: 0.6762... Generator Loss: 1.2125
Epoch 7/25... Discriminator Loss: 0.6002... Generator Loss: 1.3004
Epoch 7/25... Discriminator Loss: 0.9457... Generator Loss: 0.9309
Epoch 7/25... Discriminator Loss: 0.4473... Generator Loss: 1.6302
Epoch 7/25... Discriminator Loss: 1.2302... Generator Loss: 0.5610
Epoch 7/25... Discriminator Loss: 0.5638... Generator Loss: 1.5397
Epoch 7/25... Discriminator Loss: 0.7611... Generator Loss: 0.8840
Epoch 7/25... Discriminator Loss: 0.7754... Generator Loss: 0.8650
Epoch 7/25... Discriminator Loss: 0.4427... Generator Loss: 1.8027
Epoch 7/25... Discriminator Loss: 0.6610... Generator Loss: 1.1807
Epoch 7/25... Discriminator Loss: 0.8488... Generator Loss: 1.1459
Epoch 7/25... Discriminator Loss: 0.4980... Generator Loss: 1.4152
Epoch 7/25... Discriminator Loss: 0.8848... Generator Loss: 0.7679
Epoch 7/25... Discriminator Loss: 0.8590... Generator Loss: 0.8342
Epoch 7/25... Discriminator Loss: 0.6539... Generator Loss: 1.1110
Epoch 7/25... Discriminator Loss: 0.6844... Generator Loss: 1.0110
Epoch 7/25... Discriminator Loss: 0.5846... Generator Loss: 1.5616
Epoch 7/25... Discriminator Loss: 0.8300... Generator Loss: 1.4049
Epoch 7/25... Discriminator Loss: 0.8490... Generator Loss: 0.7127
Epoch 8/25... Discriminator Loss: 0.6042... Generator Loss: 1.1840
Epoch 8/25... Discriminator Loss: 0.7043... Generator Loss: 1.0833
Epoch 8/25... Discriminator Loss: 0.9095... Generator Loss: 0.7975
Epoch 8/25... Discriminator Loss: 0.5111... Generator Loss: 1.4181
Epoch 8/25... Discriminator Loss: 0.7810... Generator Loss: 0.8210
Epoch 8/25... Discriminator Loss: 0.6073... Generator Loss: 1.0689
Epoch 8/25... Discriminator Loss: 0.5882... Generator Loss: 2.1473
Epoch 8/25... Discriminator Loss: 0.4052... Generator Loss: 2.0405
Epoch 8/25... Discriminator Loss: 0.5809... Generator Loss: 1.2733
Epoch 8/25... Discriminator Loss: 1.1371... Generator Loss: 0.5203
Epoch 8/25... Discriminator Loss: 0.6141... Generator Loss: 1.1701
Epoch 8/25... Discriminator Loss: 0.6560... Generator Loss: 1.5314
Epoch 8/25... Discriminator Loss: 0.6818... Generator Loss: 1.3052
Epoch 8/25... Discriminator Loss: 0.4941... Generator Loss: 1.3459
Epoch 8/25... Discriminator Loss: 0.6562... Generator Loss: 0.9961
Epoch 8/25... Discriminator Loss: 0.9714... Generator Loss: 0.6918
Epoch 8/25... Discriminator Loss: 0.5988... Generator Loss: 1.1092
Epoch 8/25... Discriminator Loss: 0.7992... Generator Loss: 1.0105
Epoch 8/25... Discriminator Loss: 0.4894... Generator Loss: 1.5042
Epoch 8/25... Discriminator Loss: 0.5936... Generator Loss: 1.2442
Epoch 8/25... Discriminator Loss: 0.8223... Generator Loss: 0.9784
Epoch 8/25... Discriminator Loss: 0.5011... Generator Loss: 1.4955
Epoch 8/25... Discriminator Loss: 0.9520... Generator Loss: 0.6997
Epoch 8/25... Discriminator Loss: 0.4418... Generator Loss: 2.1632
Epoch 8/25... Discriminator Loss: 0.8642... Generator Loss: 2.6402
Epoch 8/25... Discriminator Loss: 0.7703... Generator Loss: 0.8722
Epoch 8/25... Discriminator Loss: 0.5962... Generator Loss: 1.1636
Epoch 8/25... Discriminator Loss: 0.8002... Generator Loss: 1.0623
Epoch 8/25... Discriminator Loss: 0.9145... Generator Loss: 3.4130
Epoch 9/25... Discriminator Loss: 0.7205... Generator Loss: 0.9371
Epoch 9/25... Discriminator Loss: 0.7144... Generator Loss: 1.0387
Epoch 9/25... Discriminator Loss: 0.5375... Generator Loss: 1.7096
Epoch 9/25... Discriminator Loss: 0.4861... Generator Loss: 1.9928
Epoch 9/25... Discriminator Loss: 0.6679... Generator Loss: 0.9882
Epoch 9/25... Discriminator Loss: 0.6544... Generator Loss: 1.0749
Epoch 9/25... Discriminator Loss: 1.0676... Generator Loss: 0.5903
Epoch 9/25... Discriminator Loss: 0.6190... Generator Loss: 1.2288
Epoch 9/25... Discriminator Loss: 0.6280... Generator Loss: 1.2956
Epoch 9/25... Discriminator Loss: 0.4988... Generator Loss: 1.3548
Epoch 9/25... Discriminator Loss: 1.1148... Generator Loss: 0.5758
Epoch 9/25... Discriminator Loss: 0.5760... Generator Loss: 1.3354
Epoch 9/25... Discriminator Loss: 0.7098... Generator Loss: 1.7985
Epoch 9/25... Discriminator Loss: 0.9567... Generator Loss: 2.3271
Epoch 9/25... Discriminator Loss: 0.5218... Generator Loss: 1.8360
Epoch 9/25... Discriminator Loss: 0.5757... Generator Loss: 1.2494
Epoch 9/25... Discriminator Loss: 0.5497... Generator Loss: 1.2677
Epoch 9/25... Discriminator Loss: 0.6084... Generator Loss: 1.2853
Epoch 9/25... Discriminator Loss: 0.6505... Generator Loss: 1.1468
Epoch 9/25... Discriminator Loss: 0.3649... Generator Loss: 1.6016
Epoch 9/25... Discriminator Loss: 0.5980... Generator Loss: 1.1501
Epoch 9/25... Discriminator Loss: 0.6610... Generator Loss: 1.0000
Epoch 9/25... Discriminator Loss: 0.6257... Generator Loss: 1.2836
Epoch 9/25... Discriminator Loss: 0.7991... Generator Loss: 0.9034
Epoch 9/25... Discriminator Loss: 1.8351... Generator Loss: 0.2561
Epoch 9/25... Discriminator Loss: 0.7141... Generator Loss: 1.9286
Epoch 9/25... Discriminator Loss: 0.7995... Generator Loss: 1.6825
Epoch 9/25... Discriminator Loss: 1.4765... Generator Loss: 0.3339
Epoch 10/25... Discriminator Loss: 0.6529... Generator Loss: 1.2124
Epoch 10/25... Discriminator Loss: 0.7199... Generator Loss: 0.9631
Epoch 10/25... Discriminator Loss: 1.3805... Generator Loss: 0.3992
Epoch 10/25... Discriminator Loss: 0.4700... Generator Loss: 1.5116
Epoch 10/25... Discriminator Loss: 0.6948... Generator Loss: 1.0455
Epoch 10/25... Discriminator Loss: 1.0343... Generator Loss: 0.6696
Epoch 10/25... Discriminator Loss: 0.9808... Generator Loss: 0.6322
Epoch 10/25... Discriminator Loss: 0.5547... Generator Loss: 1.5723
Epoch 10/25... Discriminator Loss: 0.7333... Generator Loss: 0.9818
Epoch 10/25... Discriminator Loss: 1.2172... Generator Loss: 0.4930
Epoch 10/25... Discriminator Loss: 0.5195... Generator Loss: 1.3608
Epoch 10/25... Discriminator Loss: 0.5610... Generator Loss: 1.7319
Epoch 10/25... Discriminator Loss: 0.5761... Generator Loss: 2.5001
Epoch 10/25... Discriminator Loss: 0.8090... Generator Loss: 0.9166
Epoch 10/25... Discriminator Loss: 0.4565... Generator Loss: 1.6716
Epoch 10/25... Discriminator Loss: 0.6966... Generator Loss: 0.9531
Epoch 10/25... Discriminator Loss: 0.6279... Generator Loss: 1.2260
Epoch 10/25... Discriminator Loss: 0.7516... Generator Loss: 0.8947
Epoch 10/25... Discriminator Loss: 0.6652... Generator Loss: 0.9525
Epoch 10/25... Discriminator Loss: 0.5606... Generator Loss: 1.7157
Epoch 10/25... Discriminator Loss: 0.7421... Generator Loss: 0.9452
Epoch 10/25... Discriminator Loss: 0.8286... Generator Loss: 0.8216
Epoch 10/25... Discriminator Loss: 3.3553... Generator Loss: 0.0922
Epoch 10/25... Discriminator Loss: 0.6863... Generator Loss: 1.0958
Epoch 10/25... Discriminator Loss: 0.8795... Generator Loss: 0.7183
Epoch 10/25... Discriminator Loss: 0.6603... Generator Loss: 0.9965
Epoch 10/25... Discriminator Loss: 0.9471... Generator Loss: 4.4551
Epoch 10/25... Discriminator Loss: 0.5445... Generator Loss: 1.6074
Epoch 10/25... Discriminator Loss: 0.7738... Generator Loss: 0.8367
Epoch 11/25... Discriminator Loss: 0.7810... Generator Loss: 0.9070
Epoch 11/25... Discriminator Loss: 1.1176... Generator Loss: 0.6349
Epoch 11/25... Discriminator Loss: 0.6129... Generator Loss: 1.3102
Epoch 11/25... Discriminator Loss: 0.5345... Generator Loss: 1.5728
Epoch 11/25... Discriminator Loss: 0.5485... Generator Loss: 1.4005
Epoch 11/25... Discriminator Loss: 0.7215... Generator Loss: 0.9313
Epoch 11/25... Discriminator Loss: 0.3299... Generator Loss: 2.2832
Epoch 11/25... Discriminator Loss: 0.3617... Generator Loss: 1.9089
Epoch 11/25... Discriminator Loss: 1.6057... Generator Loss: 0.3163
Epoch 11/25... Discriminator Loss: 0.5002... Generator Loss: 1.7050
Epoch 11/25... Discriminator Loss: 0.5922... Generator Loss: 1.2010
Epoch 11/25... Discriminator Loss: 0.3519... Generator Loss: 2.2306
Epoch 11/25... Discriminator Loss: 1.1460... Generator Loss: 0.5672
Epoch 11/25... Discriminator Loss: 0.9835... Generator Loss: 0.6462
Epoch 11/25... Discriminator Loss: 0.6134... Generator Loss: 1.2678
Epoch 11/25... Discriminator Loss: 0.5108... Generator Loss: 2.2082
Epoch 11/25... Discriminator Loss: 0.8694... Generator Loss: 0.7946
Epoch 11/25... Discriminator Loss: 1.0842... Generator Loss: 0.6214
Epoch 11/25... Discriminator Loss: 0.9722... Generator Loss: 0.6177
Epoch 11/25... Discriminator Loss: 0.5295... Generator Loss: 1.2719
Epoch 11/25... Discriminator Loss: 1.1834... Generator Loss: 0.5171
Epoch 11/25... Discriminator Loss: 0.8652... Generator Loss: 0.8403
Epoch 11/25... Discriminator Loss: 1.0159... Generator Loss: 0.6851
Epoch 11/25... Discriminator Loss: 0.4764... Generator Loss: 1.3272
Epoch 11/25... Discriminator Loss: 0.4711... Generator Loss: 1.6353
Epoch 11/25... Discriminator Loss: 0.6837... Generator Loss: 0.9462
Epoch 11/25... Discriminator Loss: 0.7467... Generator Loss: 0.8849
Epoch 11/25... Discriminator Loss: 0.4659... Generator Loss: 1.9995
Epoch 11/25... Discriminator Loss: 0.7917... Generator Loss: 0.8731
Epoch 12/25... Discriminator Loss: 0.7437... Generator Loss: 1.0748
Epoch 12/25... Discriminator Loss: 0.4420... Generator Loss: 1.6013
Epoch 12/25... Discriminator Loss: 1.0542... Generator Loss: 0.5597
Epoch 12/25... Discriminator Loss: 0.4994... Generator Loss: 1.6735
Epoch 12/25... Discriminator Loss: 0.9668... Generator Loss: 0.6602
Epoch 12/25... Discriminator Loss: 0.7801... Generator Loss: 1.1639
Epoch 12/25... Discriminator Loss: 0.8853... Generator Loss: 0.8011
Epoch 12/25... Discriminator Loss: 2.9956... Generator Loss: 0.1315
Epoch 12/25... Discriminator Loss: 0.5375... Generator Loss: 1.7190
Epoch 12/25... Discriminator Loss: 0.9768... Generator Loss: 0.6940
Epoch 12/25... Discriminator Loss: 0.4977... Generator Loss: 1.3391
Epoch 12/25... Discriminator Loss: 0.4784... Generator Loss: 1.7882
Epoch 12/25... Discriminator Loss: 0.4596... Generator Loss: 1.5473
Epoch 12/25... Discriminator Loss: 0.4788... Generator Loss: 1.3649
Epoch 12/25... Discriminator Loss: 0.5526... Generator Loss: 1.1824
Epoch 12/25... Discriminator Loss: 0.7221... Generator Loss: 1.4543
Epoch 12/25... Discriminator Loss: 1.1990... Generator Loss: 0.5186
Epoch 12/25... Discriminator Loss: 1.0401... Generator Loss: 0.6442
Epoch 12/25... Discriminator Loss: 0.5677... Generator Loss: 1.7453
Epoch 12/25... Discriminator Loss: 0.4336... Generator Loss: 1.4896
Epoch 12/25... Discriminator Loss: 0.4083... Generator Loss: 1.4838
Epoch 12/25... Discriminator Loss: 1.1108... Generator Loss: 0.5434
Epoch 12/25... Discriminator Loss: 1.0108... Generator Loss: 0.5963
Epoch 12/25... Discriminator Loss: 0.8025... Generator Loss: 0.9046
Epoch 12/25... Discriminator Loss: 0.6500... Generator Loss: 1.2251
Epoch 12/25... Discriminator Loss: 0.4833... Generator Loss: 1.6231
Epoch 12/25... Discriminator Loss: 0.9572... Generator Loss: 0.8654
Epoch 12/25... Discriminator Loss: 0.8272... Generator Loss: 2.2820
Epoch 13/25... Discriminator Loss: 0.4466... Generator Loss: 1.5445
Epoch 13/25... Discriminator Loss: 0.7295... Generator Loss: 1.2771
Epoch 13/25... Discriminator Loss: 0.5908... Generator Loss: 2.1914
Epoch 13/25... Discriminator Loss: 1.8671... Generator Loss: 0.2500
Epoch 13/25... Discriminator Loss: 0.4804... Generator Loss: 1.5565
Epoch 13/25... Discriminator Loss: 0.5414... Generator Loss: 1.2986
Epoch 13/25... Discriminator Loss: 0.5070... Generator Loss: 1.7817
Epoch 13/25... Discriminator Loss: 0.5696... Generator Loss: 1.2386
Epoch 13/25... Discriminator Loss: 0.6625... Generator Loss: 1.0833
Epoch 13/25... Discriminator Loss: 1.2079... Generator Loss: 2.1285
Epoch 13/25... Discriminator Loss: 1.0092... Generator Loss: 0.6857
Epoch 13/25... Discriminator Loss: 0.9745... Generator Loss: 0.6337
Epoch 13/25... Discriminator Loss: 0.5245... Generator Loss: 1.6807
Epoch 13/25... Discriminator Loss: 0.9084... Generator Loss: 0.6763
Epoch 13/25... Discriminator Loss: 0.7783... Generator Loss: 0.8475
Epoch 13/25... Discriminator Loss: 0.5405... Generator Loss: 1.2737
Epoch 13/25... Discriminator Loss: 0.6379... Generator Loss: 1.1504
Epoch 13/25... Discriminator Loss: 0.7001... Generator Loss: 1.3523
Epoch 13/25... Discriminator Loss: 1.2540... Generator Loss: 0.4929
Epoch 13/25... Discriminator Loss: 0.4361... Generator Loss: 1.5306
Epoch 13/25... Discriminator Loss: 0.7248... Generator Loss: 1.0985
Epoch 13/25... Discriminator Loss: 0.8548... Generator Loss: 0.8256
Epoch 13/25... Discriminator Loss: 1.0434... Generator Loss: 0.7271
Epoch 13/25... Discriminator Loss: 0.7943... Generator Loss: 0.8191
Epoch 13/25... Discriminator Loss: 0.8150... Generator Loss: 0.8273
Epoch 13/25... Discriminator Loss: 0.6332... Generator Loss: 1.1246
Epoch 13/25... Discriminator Loss: 0.4399... Generator Loss: 1.5223
Epoch 13/25... Discriminator Loss: 0.9649... Generator Loss: 0.6801
Epoch 13/25... Discriminator Loss: 1.1509... Generator Loss: 0.5186
Epoch 14/25... Discriminator Loss: 0.7058... Generator Loss: 1.0045
Epoch 14/25... Discriminator Loss: 0.5519... Generator Loss: 1.2886
Epoch 14/25... Discriminator Loss: 0.5769... Generator Loss: 1.2917
Epoch 14/25... Discriminator Loss: 1.2344... Generator Loss: 0.4980
Epoch 14/25... Discriminator Loss: 0.6704... Generator Loss: 1.0978
Epoch 14/25... Discriminator Loss: 0.5172... Generator Loss: 1.1016
Epoch 14/25... Discriminator Loss: 0.7519... Generator Loss: 1.0734
Epoch 14/25... Discriminator Loss: 0.9192... Generator Loss: 0.7149
Epoch 14/25... Discriminator Loss: 0.8405... Generator Loss: 2.4530
Epoch 14/25... Discriminator Loss: 0.9192... Generator Loss: 1.2673
Epoch 14/25... Discriminator Loss: 0.6121... Generator Loss: 1.5839
Epoch 14/25... Discriminator Loss: 0.5605... Generator Loss: 1.1796
Epoch 14/25... Discriminator Loss: 0.3741... Generator Loss: 1.8675
Epoch 14/25... Discriminator Loss: 0.4941... Generator Loss: 1.4164
Epoch 14/25... Discriminator Loss: 0.5460... Generator Loss: 1.2620
Epoch 14/25... Discriminator Loss: 0.8560... Generator Loss: 0.7495
Epoch 14/25... Discriminator Loss: 0.4914... Generator Loss: 1.3714
Epoch 14/25... Discriminator Loss: 0.7199... Generator Loss: 1.0948
Epoch 14/25... Discriminator Loss: 0.8451... Generator Loss: 0.8094
Epoch 14/25... Discriminator Loss: 0.8956... Generator Loss: 0.7184
Epoch 14/25... Discriminator Loss: 0.7811... Generator Loss: 0.9414
Epoch 14/25... Discriminator Loss: 0.5948... Generator Loss: 1.1664
Epoch 14/25... Discriminator Loss: 1.3992... Generator Loss: 0.4524
Epoch 14/25... Discriminator Loss: 1.3779... Generator Loss: 0.4358
Epoch 14/25... Discriminator Loss: 1.0833... Generator Loss: 0.5920
Epoch 14/25... Discriminator Loss: 0.6232... Generator Loss: 1.1015
Epoch 14/25... Discriminator Loss: 0.6783... Generator Loss: 0.9830
Epoch 14/25... Discriminator Loss: 1.5081... Generator Loss: 0.3471
Epoch 14/25... Discriminator Loss: 1.3427... Generator Loss: 0.4442
Epoch 15/25... Discriminator Loss: 0.5176... Generator Loss: 1.6113
Epoch 15/25... Discriminator Loss: 0.6213... Generator Loss: 1.1551
Epoch 15/25... Discriminator Loss: 0.7268... Generator Loss: 2.2146
Epoch 15/25... Discriminator Loss: 0.8443... Generator Loss: 0.8120
Epoch 15/25... Discriminator Loss: 0.8367... Generator Loss: 0.8855
Epoch 15/25... Discriminator Loss: 0.4829... Generator Loss: 1.7754
Epoch 15/25... Discriminator Loss: 0.4853... Generator Loss: 1.5914
Epoch 15/25... Discriminator Loss: 0.8056... Generator Loss: 1.4896
Epoch 15/25... Discriminator Loss: 1.0036... Generator Loss: 0.7122
Epoch 15/25... Discriminator Loss: 0.5079... Generator Loss: 1.3763
Epoch 15/25... Discriminator Loss: 0.7290... Generator Loss: 0.9172
Epoch 15/25... Discriminator Loss: 0.6823... Generator Loss: 1.0149
Epoch 15/25... Discriminator Loss: 0.7762... Generator Loss: 0.8688
Epoch 15/25... Discriminator Loss: 1.2568... Generator Loss: 0.5004
Epoch 15/25... Discriminator Loss: 0.7836... Generator Loss: 0.9739
Epoch 15/25... Discriminator Loss: 0.5646... Generator Loss: 1.4335
Epoch 15/25... Discriminator Loss: 0.9199... Generator Loss: 1.9072
Epoch 15/25... Discriminator Loss: 0.7462... Generator Loss: 0.9048
Epoch 15/25... Discriminator Loss: 0.5936... Generator Loss: 1.3687
Epoch 15/25... Discriminator Loss: 1.1725... Generator Loss: 0.5333
Epoch 15/25... Discriminator Loss: 0.4519... Generator Loss: 1.4578
Epoch 15/25... Discriminator Loss: 1.7190... Generator Loss: 0.3012
Epoch 15/25... Discriminator Loss: 0.5952... Generator Loss: 1.5834
Epoch 15/25... Discriminator Loss: 0.9647... Generator Loss: 0.7573
Epoch 15/25... Discriminator Loss: 1.7354... Generator Loss: 0.2827
Epoch 15/25... Discriminator Loss: 0.9432... Generator Loss: 0.6713
Epoch 15/25... Discriminator Loss: 0.6598... Generator Loss: 1.0139
Epoch 15/25... Discriminator Loss: 0.4842... Generator Loss: 1.4967
Epoch 16/25... Discriminator Loss: 1.2363... Generator Loss: 0.4751
Epoch 16/25... Discriminator Loss: 0.7245... Generator Loss: 1.0399
Epoch 16/25... Discriminator Loss: 0.7684... Generator Loss: 0.8864
Epoch 16/25... Discriminator Loss: 2.4137... Generator Loss: 4.9496
Epoch 16/25... Discriminator Loss: 0.9187... Generator Loss: 0.7578
Epoch 16/25... Discriminator Loss: 0.6387... Generator Loss: 1.0291
Epoch 16/25... Discriminator Loss: 0.5874... Generator Loss: 1.3208
Epoch 16/25... Discriminator Loss: 1.4402... Generator Loss: 0.3824
Epoch 16/25... Discriminator Loss: 0.7523... Generator Loss: 1.0198
Epoch 16/25... Discriminator Loss: 0.4883... Generator Loss: 1.3488
Epoch 16/25... Discriminator Loss: 0.8758... Generator Loss: 0.7813
Epoch 16/25... Discriminator Loss: 0.7523... Generator Loss: 0.9631
Epoch 16/25... Discriminator Loss: 1.1640... Generator Loss: 0.4932
Epoch 16/25... Discriminator Loss: 0.8204... Generator Loss: 0.8961
Epoch 16/25... Discriminator Loss: 2.4648... Generator Loss: 5.3066
Epoch 16/25... Discriminator Loss: 0.6407... Generator Loss: 1.4019
Epoch 16/25... Discriminator Loss: 0.7362... Generator Loss: 0.9639
Epoch 16/25... Discriminator Loss: 0.6231... Generator Loss: 1.3401
Epoch 16/25... Discriminator Loss: 0.6924... Generator Loss: 1.0162
Epoch 16/25... Discriminator Loss: 0.8488... Generator Loss: 0.7746
Epoch 16/25... Discriminator Loss: 0.5312... Generator Loss: 1.4957
Epoch 16/25... Discriminator Loss: 0.7650... Generator Loss: 0.9209
Epoch 16/25... Discriminator Loss: 1.2622... Generator Loss: 0.4722
Epoch 16/25... Discriminator Loss: 1.2454... Generator Loss: 0.4924
Epoch 16/25... Discriminator Loss: 0.8944... Generator Loss: 0.7691
Epoch 16/25... Discriminator Loss: 1.0650... Generator Loss: 0.6477
Epoch 16/25... Discriminator Loss: 0.8442... Generator Loss: 3.1099
Epoch 16/25... Discriminator Loss: 0.5490... Generator Loss: 1.3523
Epoch 16/25... Discriminator Loss: 0.6895... Generator Loss: 1.5546
Epoch 17/25... Discriminator Loss: 0.9372... Generator Loss: 0.7389
Epoch 17/25... Discriminator Loss: 0.5958... Generator Loss: 1.2052
Epoch 17/25... Discriminator Loss: 0.5398... Generator Loss: 1.2759
Epoch 17/25... Discriminator Loss: 0.6583... Generator Loss: 1.3921
Epoch 17/25... Discriminator Loss: 1.2220... Generator Loss: 0.5009
Epoch 17/25... Discriminator Loss: 0.6470... Generator Loss: 1.0579
Epoch 17/25... Discriminator Loss: 0.6383... Generator Loss: 1.6996
Epoch 17/25... Discriminator Loss: 0.5491... Generator Loss: 1.2719
Epoch 17/25... Discriminator Loss: 0.6492... Generator Loss: 1.4223
Epoch 17/25... Discriminator Loss: 1.0287... Generator Loss: 0.6299
Epoch 17/25... Discriminator Loss: 0.4691... Generator Loss: 1.3665
Epoch 17/25... Discriminator Loss: 0.6079... Generator Loss: 1.0888
Epoch 17/25... Discriminator Loss: 0.9277... Generator Loss: 0.7696
Epoch 17/25... Discriminator Loss: 1.5717... Generator Loss: 0.3363
Epoch 17/25... Discriminator Loss: 0.3858... Generator Loss: 1.7450
Epoch 17/25... Discriminator Loss: 0.9648... Generator Loss: 0.6855
Epoch 17/25... Discriminator Loss: 0.4583... Generator Loss: 1.6299
Epoch 17/25... Discriminator Loss: 1.1393... Generator Loss: 0.5825
Epoch 17/25... Discriminator Loss: 0.8073... Generator Loss: 0.8591
Epoch 17/25... Discriminator Loss: 0.6452... Generator Loss: 1.2233
Epoch 17/25... Discriminator Loss: 0.5449... Generator Loss: 1.6786
Epoch 17/25... Discriminator Loss: 1.0358... Generator Loss: 0.6030
Epoch 17/25... Discriminator Loss: 0.3615... Generator Loss: 2.1490
Epoch 17/25... Discriminator Loss: 0.4632... Generator Loss: 1.4839
Epoch 17/25... Discriminator Loss: 0.7246... Generator Loss: 0.9724
Epoch 17/25... Discriminator Loss: 0.9037... Generator Loss: 0.7224
Epoch 17/25... Discriminator Loss: 1.3715... Generator Loss: 0.5710
Epoch 17/25... Discriminator Loss: 1.3226... Generator Loss: 0.4608
Epoch 17/25... Discriminator Loss: 0.9055... Generator Loss: 0.7816
Epoch 18/25... Discriminator Loss: 0.7861... Generator Loss: 0.8933
Epoch 18/25... Discriminator Loss: 0.7340... Generator Loss: 0.9485
Epoch 18/25... Discriminator Loss: 0.6818... Generator Loss: 0.9940
Epoch 18/25... Discriminator Loss: 0.9697... Generator Loss: 0.7168
Epoch 18/25... Discriminator Loss: 0.3771... Generator Loss: 1.9554
Epoch 18/25... Discriminator Loss: 0.9217... Generator Loss: 0.7382
Epoch 18/25... Discriminator Loss: 0.6532... Generator Loss: 1.0484
Epoch 18/25... Discriminator Loss: 0.4014... Generator Loss: 1.8960
Epoch 18/25... Discriminator Loss: 0.6428... Generator Loss: 1.1798
Epoch 18/25... Discriminator Loss: 0.5919... Generator Loss: 1.3418
Epoch 18/25... Discriminator Loss: 0.8760... Generator Loss: 0.7499
Epoch 18/25... Discriminator Loss: 0.9104... Generator Loss: 0.7944
Epoch 18/25... Discriminator Loss: 0.5994... Generator Loss: 1.1074
Epoch 18/25... Discriminator Loss: 1.0536... Generator Loss: 0.6322
Epoch 18/25... Discriminator Loss: 0.6711... Generator Loss: 1.1999
Epoch 18/25... Discriminator Loss: 0.8501... Generator Loss: 0.8671
Epoch 18/25... Discriminator Loss: 0.5505... Generator Loss: 1.4130
Epoch 18/25... Discriminator Loss: 0.7316... Generator Loss: 1.2871
Epoch 18/25... Discriminator Loss: 0.7054... Generator Loss: 0.9705
Epoch 18/25... Discriminator Loss: 0.5836... Generator Loss: 1.2902
Epoch 18/25... Discriminator Loss: 0.5343... Generator Loss: 1.1919
Epoch 18/25... Discriminator Loss: 1.7999... Generator Loss: 3.5937
Epoch 18/25... Discriminator Loss: 0.5573... Generator Loss: 1.2722
Epoch 18/25... Discriminator Loss: 1.2836... Generator Loss: 0.4665
Epoch 18/25... Discriminator Loss: 0.7561... Generator Loss: 1.0980
Epoch 18/25... Discriminator Loss: 1.0439... Generator Loss: 0.6889
Epoch 18/25... Discriminator Loss: 0.5543... Generator Loss: 1.1965
Epoch 18/25... Discriminator Loss: 1.0847... Generator Loss: 0.6090
Epoch 19/25... Discriminator Loss: 1.0461... Generator Loss: 0.6666
Epoch 19/25... Discriminator Loss: 0.9079... Generator Loss: 0.7488
Epoch 19/25... Discriminator Loss: 0.8800... Generator Loss: 0.7957
Epoch 19/25... Discriminator Loss: 0.9976... Generator Loss: 0.7206
Epoch 19/25... Discriminator Loss: 1.3328... Generator Loss: 0.4099
Epoch 19/25... Discriminator Loss: 0.5977... Generator Loss: 1.1846
Epoch 19/25... Discriminator Loss: 1.2950... Generator Loss: 0.4719
Epoch 19/25... Discriminator Loss: 0.7053... Generator Loss: 1.0550
Epoch 19/25... Discriminator Loss: 0.5971... Generator Loss: 1.1513
Epoch 19/25... Discriminator Loss: 1.1291... Generator Loss: 0.7935
Epoch 19/25... Discriminator Loss: 1.0220... Generator Loss: 0.6696
Epoch 19/25... Discriminator Loss: 1.0720... Generator Loss: 0.6545
Epoch 19/25... Discriminator Loss: 0.6593... Generator Loss: 1.0999
Epoch 19/25... Discriminator Loss: 0.8403... Generator Loss: 0.7576
Epoch 19/25... Discriminator Loss: 3.1732... Generator Loss: 0.1317
Epoch 19/25... Discriminator Loss: 1.0766... Generator Loss: 0.6784
Epoch 19/25... Discriminator Loss: 0.8699... Generator Loss: 0.8145
Epoch 19/25... Discriminator Loss: 1.0094... Generator Loss: 0.6654
Epoch 19/25... Discriminator Loss: 0.7059... Generator Loss: 1.0941
Epoch 19/25... Discriminator Loss: 0.7204... Generator Loss: 0.9857
Epoch 19/25... Discriminator Loss: 0.6759... Generator Loss: 1.1241
Epoch 19/25... Discriminator Loss: 0.8290... Generator Loss: 0.8242
Epoch 19/25... Discriminator Loss: 0.9772... Generator Loss: 0.7776
Epoch 19/25... Discriminator Loss: 0.3223... Generator Loss: 2.1903
Epoch 19/25... Discriminator Loss: 1.2967... Generator Loss: 0.4892
Epoch 19/25... Discriminator Loss: 0.6584... Generator Loss: 1.1149
Epoch 19/25... Discriminator Loss: 1.1353... Generator Loss: 0.6254
Epoch 19/25... Discriminator Loss: 0.8045... Generator Loss: 0.8672
Epoch 19/25... Discriminator Loss: 0.6215... Generator Loss: 1.0630
Epoch 20/25... Discriminator Loss: 0.6199... Generator Loss: 1.4807
Epoch 20/25... Discriminator Loss: 0.9683... Generator Loss: 0.6664
Epoch 20/25... Discriminator Loss: 1.2914... Generator Loss: 0.4375
Epoch 20/25... Discriminator Loss: 0.5295... Generator Loss: 1.5849
Epoch 20/25... Discriminator Loss: 0.7390... Generator Loss: 0.9673
Epoch 20/25... Discriminator Loss: 0.9196... Generator Loss: 0.7311
Epoch 20/25... Discriminator Loss: 1.0030... Generator Loss: 0.6463
Epoch 20/25... Discriminator Loss: 0.7869... Generator Loss: 0.9596
Epoch 20/25... Discriminator Loss: 0.9999... Generator Loss: 2.9024
Epoch 20/25... Discriminator Loss: 0.7165... Generator Loss: 1.0333
Epoch 20/25... Discriminator Loss: 1.2154... Generator Loss: 0.5093
Epoch 20/25... Discriminator Loss: 0.7831... Generator Loss: 0.9043
Epoch 20/25... Discriminator Loss: 1.5708... Generator Loss: 0.3447
Epoch 20/25... Discriminator Loss: 0.6138... Generator Loss: 1.1355
Epoch 20/25... Discriminator Loss: 2.0138... Generator Loss: 0.2292
Epoch 20/25... Discriminator Loss: 0.5542... Generator Loss: 1.2609
Epoch 20/25... Discriminator Loss: 0.7850... Generator Loss: 0.9067
Epoch 20/25... Discriminator Loss: 1.1212... Generator Loss: 2.3025
Epoch 20/25... Discriminator Loss: 0.7869... Generator Loss: 0.8972
Epoch 20/25... Discriminator Loss: 1.0025... Generator Loss: 0.7018
Epoch 20/25... Discriminator Loss: 0.8696... Generator Loss: 0.8250
Epoch 20/25... Discriminator Loss: 1.3586... Generator Loss: 0.4667
Epoch 20/25... Discriminator Loss: 0.7824... Generator Loss: 0.8889
Epoch 20/25... Discriminator Loss: 1.0365... Generator Loss: 0.6395
Epoch 20/25... Discriminator Loss: 0.5064... Generator Loss: 1.9561
Epoch 20/25... Discriminator Loss: 0.7918... Generator Loss: 0.9079
Epoch 20/25... Discriminator Loss: 0.6737... Generator Loss: 1.6241
Epoch 20/25... Discriminator Loss: 1.9074... Generator Loss: 0.2644
Epoch 20/25... Discriminator Loss: 0.2379... Generator Loss: 2.5258
Epoch 21/25... Discriminator Loss: 1.0465... Generator Loss: 0.6451
Epoch 21/25... Discriminator Loss: 1.1553... Generator Loss: 0.5785
Epoch 21/25... Discriminator Loss: 1.2873... Generator Loss: 0.5179
Epoch 21/25... Discriminator Loss: 0.6614... Generator Loss: 0.9075
Epoch 21/25... Discriminator Loss: 0.4734... Generator Loss: 1.4098
Epoch 21/25... Discriminator Loss: 0.9840... Generator Loss: 0.7233
Epoch 21/25... Discriminator Loss: 0.9496... Generator Loss: 0.7101
Epoch 21/25... Discriminator Loss: 1.0864... Generator Loss: 0.6254
Epoch 21/25... Discriminator Loss: 2.6015... Generator Loss: 0.2957
Epoch 21/25... Discriminator Loss: 2.2028... Generator Loss: 0.1963
Epoch 21/25... Discriminator Loss: 0.6188... Generator Loss: 1.9035
Epoch 21/25... Discriminator Loss: 0.8430... Generator Loss: 0.8392
Epoch 21/25... Discriminator Loss: 0.5242... Generator Loss: 1.4207
Epoch 21/25... Discriminator Loss: 0.6424... Generator Loss: 1.1073
Epoch 21/25... Discriminator Loss: 0.9758... Generator Loss: 0.6964
Epoch 21/25... Discriminator Loss: 1.2313... Generator Loss: 0.5623
Epoch 21/25... Discriminator Loss: 1.0427... Generator Loss: 0.6924
Epoch 21/25... Discriminator Loss: 0.7888... Generator Loss: 0.8608
Epoch 21/25... Discriminator Loss: 0.5042... Generator Loss: 1.3225
Epoch 21/25... Discriminator Loss: 0.8715... Generator Loss: 0.8240
Epoch 21/25... Discriminator Loss: 1.4235... Generator Loss: 0.4177
Epoch 21/25... Discriminator Loss: 0.5999... Generator Loss: 1.4921
Epoch 21/25... Discriminator Loss: 1.7035... Generator Loss: 0.3365
Epoch 21/25... Discriminator Loss: 0.7445... Generator Loss: 1.0475
Epoch 21/25... Discriminator Loss: 0.6706... Generator Loss: 1.0310
Epoch 21/25... Discriminator Loss: 0.9793... Generator Loss: 0.8163
Epoch 21/25... Discriminator Loss: 0.8453... Generator Loss: 0.8115
Epoch 21/25... Discriminator Loss: 1.6070... Generator Loss: 0.3134
Epoch 22/25... Discriminator Loss: 0.9023... Generator Loss: 1.8097
Epoch 22/25... Discriminator Loss: 1.0422... Generator Loss: 0.6634
Epoch 22/25... Discriminator Loss: 0.5623... Generator Loss: 1.1943
Epoch 22/25... Discriminator Loss: 0.9178... Generator Loss: 0.7196
Epoch 22/25... Discriminator Loss: 0.7783... Generator Loss: 2.0713
Epoch 22/25... Discriminator Loss: 0.6382... Generator Loss: 1.0417
Epoch 22/25... Discriminator Loss: 0.7070... Generator Loss: 1.1220
Epoch 22/25... Discriminator Loss: 0.4377... Generator Loss: 2.0347
Epoch 22/25... Discriminator Loss: 1.6599... Generator Loss: 0.2918
Epoch 22/25... Discriminator Loss: 0.8155... Generator Loss: 1.6375
Epoch 22/25... Discriminator Loss: 0.5931... Generator Loss: 1.2604
Epoch 22/25... Discriminator Loss: 0.9243... Generator Loss: 0.7726
Epoch 22/25... Discriminator Loss: 0.5528... Generator Loss: 1.2987
Epoch 22/25... Discriminator Loss: 1.5167... Generator Loss: 0.3745
Epoch 22/25... Discriminator Loss: 0.5292... Generator Loss: 1.7210
Epoch 22/25... Discriminator Loss: 0.8413... Generator Loss: 0.7543
Epoch 22/25... Discriminator Loss: 0.9818... Generator Loss: 0.7065
Epoch 22/25... Discriminator Loss: 0.6633... Generator Loss: 1.3945
Epoch 22/25... Discriminator Loss: 1.1573... Generator Loss: 0.5774
Epoch 22/25... Discriminator Loss: 0.5431... Generator Loss: 1.4445
Epoch 22/25... Discriminator Loss: 0.7633... Generator Loss: 0.8911
Epoch 22/25... Discriminator Loss: 0.7730... Generator Loss: 0.8623
Epoch 22/25... Discriminator Loss: 0.6066... Generator Loss: 1.3726
Epoch 22/25... Discriminator Loss: 0.9201... Generator Loss: 0.8475
Epoch 22/25... Discriminator Loss: 0.5488... Generator Loss: 1.5972
Epoch 22/25... Discriminator Loss: 1.0892... Generator Loss: 0.6352
Epoch 22/25... Discriminator Loss: 0.6677... Generator Loss: 1.9283
Epoch 22/25... Discriminator Loss: 0.5221... Generator Loss: 1.3763
Epoch 22/25... Discriminator Loss: 1.2759... Generator Loss: 0.5005
Epoch 23/25... Discriminator Loss: 0.4236... Generator Loss: 1.7390
Epoch 23/25... Discriminator Loss: 0.5432... Generator Loss: 1.2915
Epoch 23/25... Discriminator Loss: 1.0943... Generator Loss: 0.7032
Epoch 23/25... Discriminator Loss: 0.6654... Generator Loss: 1.0695
Epoch 23/25... Discriminator Loss: 0.4184... Generator Loss: 1.8670
Epoch 23/25... Discriminator Loss: 0.8996... Generator Loss: 0.7249
Epoch 23/25... Discriminator Loss: 0.5238... Generator Loss: 2.3122
Epoch 23/25... Discriminator Loss: 0.9717... Generator Loss: 0.7285
Epoch 23/25... Discriminator Loss: 0.5294... Generator Loss: 1.4446
Epoch 23/25... Discriminator Loss: 0.5286... Generator Loss: 1.4667
Epoch 23/25... Discriminator Loss: 0.5205... Generator Loss: 1.4228
Epoch 23/25... Discriminator Loss: 0.5637... Generator Loss: 1.6809
Epoch 23/25... Discriminator Loss: 0.6989... Generator Loss: 0.9277
Epoch 23/25... Discriminator Loss: 0.6951... Generator Loss: 1.0805
Epoch 23/25... Discriminator Loss: 0.9225... Generator Loss: 1.9149
Epoch 23/25... Discriminator Loss: 0.6702... Generator Loss: 1.0190
Epoch 23/25... Discriminator Loss: 1.3030... Generator Loss: 0.5368
Epoch 23/25... Discriminator Loss: 1.1009... Generator Loss: 0.6243
Epoch 23/25... Discriminator Loss: 0.8549... Generator Loss: 0.7747
Epoch 23/25... Discriminator Loss: 0.4645... Generator Loss: 1.4630
Epoch 23/25... Discriminator Loss: 0.5503... Generator Loss: 1.8790
Epoch 23/25... Discriminator Loss: 0.7514... Generator Loss: 1.2209
Epoch 23/25... Discriminator Loss: 1.0574... Generator Loss: 0.8127
Epoch 23/25... Discriminator Loss: 1.1994... Generator Loss: 0.6057
Epoch 23/25... Discriminator Loss: 1.0076... Generator Loss: 0.7529
Epoch 23/25... Discriminator Loss: 1.0566... Generator Loss: 0.6186
Epoch 23/25... Discriminator Loss: 0.3695... Generator Loss: 1.7551
Epoch 23/25... Discriminator Loss: 0.5529... Generator Loss: 1.6654
Epoch 24/25... Discriminator Loss: 0.5765... Generator Loss: 1.1762
Epoch 24/25... Discriminator Loss: 0.7358... Generator Loss: 1.4623
Epoch 24/25... Discriminator Loss: 0.8797... Generator Loss: 0.8096
Epoch 24/25... Discriminator Loss: 0.6959... Generator Loss: 1.1071
Epoch 24/25... Discriminator Loss: 0.5265... Generator Loss: 1.3320
Epoch 24/25... Discriminator Loss: 1.1862... Generator Loss: 0.5537
Epoch 24/25... Discriminator Loss: 0.4036... Generator Loss: 1.6963
Epoch 24/25... Discriminator Loss: 0.9655... Generator Loss: 0.6734
Epoch 24/25... Discriminator Loss: 1.5107... Generator Loss: 0.4297
Epoch 24/25... Discriminator Loss: 1.3819... Generator Loss: 0.4863
Epoch 24/25... Discriminator Loss: 0.9259... Generator Loss: 0.8652
Epoch 24/25... Discriminator Loss: 0.6736... Generator Loss: 1.1326
Epoch 24/25... Discriminator Loss: 0.7372... Generator Loss: 0.9510
Epoch 24/25... Discriminator Loss: 1.0377... Generator Loss: 0.7756
Epoch 24/25... Discriminator Loss: 0.8388... Generator Loss: 0.8843
Epoch 24/25... Discriminator Loss: 0.5949... Generator Loss: 1.2707
Epoch 24/25... Discriminator Loss: 0.9413... Generator Loss: 0.7409
Epoch 24/25... Discriminator Loss: 0.9094... Generator Loss: 0.7733
Epoch 24/25... Discriminator Loss: 1.4203... Generator Loss: 0.4626
Epoch 24/25... Discriminator Loss: 0.6661... Generator Loss: 1.1323
Epoch 24/25... Discriminator Loss: 0.7543... Generator Loss: 1.0616
Epoch 24/25... Discriminator Loss: 0.7003... Generator Loss: 1.3825
Epoch 24/25... Discriminator Loss: 1.9684... Generator Loss: 0.2531
Epoch 24/25... Discriminator Loss: 0.9774... Generator Loss: 0.7928
Epoch 24/25... Discriminator Loss: 0.6322... Generator Loss: 1.0893
Epoch 24/25... Discriminator Loss: 0.7398... Generator Loss: 1.0352
Epoch 24/25... Discriminator Loss: 1.2008... Generator Loss: 0.5538
Epoch 24/25... Discriminator Loss: 1.0590... Generator Loss: 0.6005
Epoch 24/25... Discriminator Loss: 0.9896... Generator Loss: 0.8738
Epoch 25/25... Discriminator Loss: 0.6079... Generator Loss: 1.3686
Epoch 25/25... Discriminator Loss: 0.7620... Generator Loss: 1.0210
Epoch 25/25... Discriminator Loss: 1.2616... Generator Loss: 0.4664
Epoch 25/25... Discriminator Loss: 1.4822... Generator Loss: 0.3628
Epoch 25/25... Discriminator Loss: 0.5827... Generator Loss: 2.0070
Epoch 25/25... Discriminator Loss: 1.0625... Generator Loss: 0.6309
Epoch 25/25... Discriminator Loss: 0.9144... Generator Loss: 0.7911
Epoch 25/25... Discriminator Loss: 0.6906... Generator Loss: 1.0270
Epoch 25/25... Discriminator Loss: 0.5668... Generator Loss: 2.5652
Epoch 25/25... Discriminator Loss: 0.9758... Generator Loss: 0.8044
Epoch 25/25... Discriminator Loss: 0.6513... Generator Loss: 1.1191
Epoch 25/25... Discriminator Loss: 1.0661... Generator Loss: 0.7836
Epoch 25/25... Discriminator Loss: 1.2286... Generator Loss: 0.5615
Epoch 25/25... Discriminator Loss: 0.9525... Generator Loss: 0.6396
Epoch 25/25... Discriminator Loss: 0.8394... Generator Loss: 0.8966
Epoch 25/25... Discriminator Loss: 0.5027... Generator Loss: 1.8718
Epoch 25/25... Discriminator Loss: 1.0048... Generator Loss: 0.7645
Epoch 25/25... Discriminator Loss: 0.4722... Generator Loss: 1.4806
Epoch 25/25... Discriminator Loss: 0.4704... Generator Loss: 2.8551
Epoch 25/25... Discriminator Loss: 0.7332... Generator Loss: 0.9834
Epoch 25/25... Discriminator Loss: 1.1404... Generator Loss: 0.5568
Epoch 25/25... Discriminator Loss: 0.9719... Generator Loss: 0.7012
Epoch 25/25... Discriminator Loss: 0.6020... Generator Loss: 1.2101
Epoch 25/25... Discriminator Loss: 0.9124... Generator Loss: 0.8253
Epoch 25/25... Discriminator Loss: 0.8784... Generator Loss: 0.9245
Epoch 25/25... Discriminator Loss: 0.5314... Generator Loss: 1.3813
Epoch 25/25... Discriminator Loss: 0.6301... Generator Loss: 1.1036
Epoch 25/25... Discriminator Loss: 1.0706... Generator Loss: 0.7497
Epoch 25/25... Discriminator Loss: 1.6185... Generator Loss: 0.3618

In [38]:
fig, ax = plt.subplots()
losses = np.array(losses)
ax.plot(losses.T[0], label='Discriminator', alpha=0.5)
ax.plot(losses.T[1], label='Generator', alpha=0.5)
ax.set_title("Training Losses")
ax.set_xlabel("Training step")
ax.set_ylabel("Loss")
ax.legend()


Out[38]:
<matplotlib.legend.Legend at 0x7f5c1dc60f98>
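As the plot above suggests, raw GAN losses oscillate a lot from batch to batch, which can hide the overall trend. A simple moving average makes the curves easier to read. This is a minimal sketch, assuming `losses` is the list of `(d_loss, g_loss)` pairs collected during training (the synthetic `losses` array below is stand-in data, not real training output):

```python
import numpy as np

def smooth(vals, window=50):
    """Simple moving average to smooth noisy per-batch loss values."""
    vals = np.asarray(vals, dtype=float)
    if len(vals) < window:
        return vals
    kernel = np.ones(window) / window
    # 'valid' mode drops the edges, so the result has len(vals) - window + 1 points
    return np.convolve(vals, kernel, mode='valid')

# Stand-in data shaped like the notebook's losses list: (d_loss, g_loss) per step
losses = np.array([(1.0 - 0.001 * i, 0.5 + 0.001 * i) for i in range(500)])
d_smooth = smooth(losses.T[0])
g_smooth = smooth(losses.T[1])
print(d_smooth.shape, g_smooth.shape)
```

You could then plot `d_smooth` and `g_smooth` the same way as the raw curves; a larger `window` trades responsiveness for a cleaner trend line.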

In [39]:
_ = view_samples(-1, samples, 6, 12, figsize=(10,5))


