Ch 02: Concept 08

Using TensorBoard

TensorBoard is a great way to visualize what's happening behind your TensorFlow code.

In this example, we'll feed a stream of numbers, one at a time, into a running average that improves its estimate of the mean with every value. Then we can visualize the results in TensorBoard.

Let's just set ourselves up with some data to work with:


In [1]:
import tensorflow as tf
import numpy as np

raw_data = np.random.normal(10, 1, 100)
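Since raw_data holds 100 samples drawn from a normal distribution centered at 10 with standard deviation 1, its sample mean should land close to 10 — the value the moving average below will converge toward. A quick standalone check (this regenerates its own sample rather than reusing the notebook's, so the exact numbers will differ):

```python
import numpy as np

# Draw a comparable sample (assumption: a fresh draw, not the notebook's exact data)
raw_data = np.random.normal(10, 1, 100)

# With 100 samples the sample mean has a standard error of about 0.1,
# so it should fall very close to the true mean of 10.
print(raw_data.mean())
```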

The exponential moving average is defined as follows. The constant alpha (between 0 and 1) controls how much weight each incoming value carries relative to the history:


In [2]:
alpha = tf.constant(0.05)                # weight given to each new value
curr_value = tf.placeholder(tf.float32)  # the incoming value, fed in each iteration
prev_avg = tf.Variable(0.)               # the running average so far

update_avg = alpha * curr_value + (1 - alpha) * prev_avg
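Before wiring this into TensorBoard, it can help to see the same update rule as a plain Python recurrence. This is a standalone sketch of the exponential moving average (assuming alpha = 0.05 and a zero initial average, as in the graph above):

```python
import numpy as np

alpha = 0.05
raw_data = np.random.normal(10, 1, 100)

avg = 0.0
for x in raw_data:
    # Identical to the update_avg op: blend the new value into the history
    avg = alpha * x + (1 - alpha) * avg

# Starting from 0, the average climbs toward the true mean of 10;
# after 100 steps only 0.95**100 (about 0.6%) of the initial bias remains.
print(avg)
```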

Here's what we want to visualize:


In [3]:
avg_hist = tf.summary.scalar("running_average", update_avg)
value_hist = tf.summary.scalar("incoming_values", curr_value)

merged = tf.summary.merge_all()           # combine all summary ops into one
writer = tf.summary.FileWriter("./logs")  # event files will be written here

Time to compute the moving averages. We'll also run the merged op to track how the values change:


In [4]:
init = tf.global_variables_initializer()

with tf.Session() as sess:
    sess.run(init)
    for i in range(len(raw_data)):
        # Evaluate the summaries and the updated average for this value
        summary_str, curr_avg = sess.run([merged, update_avg], feed_dict={curr_value: raw_data[i]})
        # Commit the new average so the next iteration builds on it
        sess.run(tf.assign(prev_avg, curr_avg))
        print(raw_data[i], curr_avg)
        writer.add_summary(summary_str, i)


9.247841477069203 0.4623921
10.019298730125382 0.9402374
11.971773672793464 1.4918143
10.702923359431118 1.9523697
11.667057068606786 2.4381042
9.143228690197773 2.7733603
9.457709656523708 3.1075776
12.33999608545561 3.5691986
9.543410229631846 3.8679092
9.251442209932934 4.137086
8.942198790212387 4.3773413
11.019946553148321 4.709471
11.430198193578404 5.0455074
8.6213954795195 5.224302
10.822108995258686 5.504192
10.58310002901428 5.7581377
10.20420365104725 5.9804406
10.312154931419304 6.1970263
10.545111153579882 6.4144306
8.797765458370709 6.5335975
8.56686695526782 6.6352606
12.570525410195215 6.9320235
11.543815331679966 7.162613
10.320920832332627 7.320528
10.423914230722215 7.4756975
10.619258439210187 7.6328754
9.101109809288653 7.7062874
9.841278298991933 7.813037
9.099955845561944 7.877383
9.41973125623955 7.9545
11.082836040691273 8.110917
10.116690980009775 8.2112055
9.402594289154155 8.270775
10.925993106488145 8.403536
10.243254438024696 8.495522
9.477769687949733 8.544634
9.351362392482848 8.58497
9.242191906408548 8.617831
12.123667719477677 8.793122
10.076009517273803 8.857266
9.74900667301667 8.901854
10.830363231386094 8.998279
8.861116004341559 8.991421
10.007389057190906 9.042218
10.769369554012615 9.128575
12.971561516039255 9.3207245
9.875042913748056 9.34844
9.64616462712992 9.363326
9.76634851758219 9.383477
9.326634526001623 9.380634
8.492294014699189 9.336217
10.006073094467316 9.369709
9.442892778881891 9.373368
9.56787198816676 9.383093
9.961494974707488 9.412013
9.572285501643822 9.420026
11.851354361154291 9.541592
10.833573476171445 9.606191
11.836376240592454 9.7177
11.047626672901409 9.784197
10.913818292308468 9.840677
10.60857743486623 9.879072
9.883074005285522 9.879272
8.227633816367192 9.79669
9.788906167639809 9.796301
9.001469197788671 9.756559
8.918205933440774 9.714642
9.885320274459133 9.723175
10.77521268535355 9.775776
9.68349427673202 9.771162
10.113753965038361 9.788292
9.6597232190883 9.781863
9.323572053812015 9.758949
9.618532841629188 9.751928
9.011944462852757 9.714929
8.323719148832197 9.645369
9.442883485401897 9.635244
10.430287903497137 9.674997
10.838671174170663 9.733181
9.346134056876938 9.713829
10.234079103904495 9.739841
9.692786236742311 9.737489
8.675916172925552 9.68441
9.7006074487691 9.68522
10.064675943184373 9.704192
9.4021612098359 9.689091
11.124410899430886 9.760857
10.034575898474612 9.774543
9.793431430576485 9.775487
10.889420930759462 9.831183
9.253518007206916 9.8023
11.114827470151916 9.867927
9.378996323459113 9.843479
9.864306640072803 9.844521
11.803316169037448 9.94246
10.103049011008196 9.95049
8.723187258083906 9.889125
8.985505621881307 9.843944
10.690261212066178 9.88626
8.426969249944442 9.813295
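As a sanity check on the trace above: the very first printed average is just alpha times the first incoming value, since prev_avg starts at 0:

```python
alpha = 0.05
first_value = 9.247841477069203  # first raw_data value in the output above
first_avg = alpha * first_value + (1 - alpha) * 0.0
print(first_avg)  # approximately 0.4623921, matching the first row
```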

Check out the visualization by running TensorBoard from the terminal, pointing it at the log directory we wrote to, and opening http://localhost:6006 in a browser:

$ tensorboard --logdir=./logs

In [5]:
# Close the writer to flush the summaries to disk
writer.close()
