In [0]:
#@title Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
TensorBoard can be used directly within notebook experiences such as Colab and Jupyter. This can be helpful for sharing results, integrating TensorBoard into existing workflows, and using TensorBoard without installing anything locally.
Start by installing TF 2.0 and loading the TensorBoard notebook extension:
For Jupyter users: If you've installed Jupyter and TensorBoard into the same virtualenv, then you should be good to go. If you're using a more complicated setup, like a global Jupyter installation and kernels for different Conda/virtualenv environments, then you must ensure that the tensorboard binary is on your PATH inside the Jupyter notebook context. One way to do this is to modify the kernel_spec to prepend the environment's bin directory to PATH, as described here.
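As an illustration, prepending the environment's bin directory can be done by adding an env entry to the kernel's kernel.json (the interpreter path and environment name below are hypothetical; adjust them to your own setup):

```json
{
  "argv": ["/opt/conda/envs/myenv/bin/python", "-m", "ipykernel_launcher", "-f", "{connection_file}"],
  "display_name": "myenv",
  "language": "python",
  "env": {"PATH": "/opt/conda/envs/myenv/bin:${PATH}"}
}
```

You can locate the kernel.json file to edit by running jupyter kernelspec list.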
If you are running a Docker image of the Jupyter Notebook server using TensorFlow's nightly, you must expose not only the notebook's port but also TensorBoard's port.
Run the container with the following command:
docker run -it -p 8888:8888 -p 6006:6006 \
tensorflow/tensorflow:nightly-py3-jupyter
where -p 6006 publishes TensorBoard's default port. This allocates a port for running one TensorBoard instance; to run concurrent instances, allocate additional ports.
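For example, to leave room for a second TensorBoard instance, you might publish one extra host port per instance (the choice of 6007 here is arbitrary):

```shell
docker run -it -p 8888:8888 -p 6006:6006 -p 6007:6007 \
    tensorflow/tensorflow:nightly-py3-jupyter
```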
In [1]:
# Load the TensorBoard notebook extension
%load_ext tensorboard
Import TensorFlow, datetime, and os:
In [0]:
import tensorflow as tf
import datetime, os
Download the Fashion-MNIST dataset and scale it:
In [4]:
fashion_mnist = tf.keras.datasets.fashion_mnist
(x_train, y_train),(x_test, y_test) = fashion_mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0
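The division by 255.0 maps the raw uint8 pixel values into the range [0, 1], which helps training converge. A quick standalone check (using a tiny NumPy array in place of the real dataset) illustrates the effect:

```python
import numpy as np

# Fake "pixels" with the same dtype and range as Fashion-MNIST images.
fake_images = np.array([[0, 128, 255]], dtype=np.uint8)

# Same scaling as above: integer pixels become floats in [0, 1].
scaled = fake_images / 255.0

print(scaled.dtype)                # float64
print(scaled.min(), scaled.max())  # 0.0 1.0
```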
Create a very simple model:
In [0]:
def create_model():
  return tf.keras.models.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(512, activation='relu'),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(10, activation='softmax')
  ])
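As a sanity check on how simple this model really is, its parameter count can be computed by hand: Flatten turns each 28x28 image into 784 features, each Dense layer has one weight per (input, unit) pair plus one bias per unit, and Dropout adds no parameters. A short sketch of the arithmetic:

```python
# Flatten: 28x28 image -> 784-element vector, no parameters.
inputs = 28 * 28

# Dense(512): weights (inputs x units) plus one bias per unit.
dense1 = inputs * 512 + 512   # 401,920

# Dropout adds no parameters; Dense(10) follows the same rule.
dense2 = 512 * 10 + 10        # 5,130

total = dense1 + dense2
print(total)  # 407050
```

This matches the total shown by model.summary() for the architecture above.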
Train the model using Keras and the TensorBoard callback:
In [6]:
def train_model():
  model = create_model()
  model.compile(optimizer='adam',
                loss='sparse_categorical_crossentropy',
                metrics=['accuracy'])

  logdir = os.path.join("logs", datetime.datetime.now().strftime("%Y%m%d-%H%M%S"))
  tensorboard_callback = tf.keras.callbacks.TensorBoard(logdir, histogram_freq=1)

  model.fit(x=x_train,
            y=y_train,
            epochs=5,
            validation_data=(x_test, y_test),
            callbacks=[tensorboard_callback])
train_model()
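The logdir above embeds a timestamp so that each call to train_model() writes to its own subdirectory, letting TensorBoard show the runs side by side. The naming scheme itself is plain standard library:

```python
import datetime
import os

# Each run gets a distinct directory such as "logs/20240131-093045".
logdir = os.path.join("logs", datetime.datetime.now().strftime("%Y%m%d-%H%M%S"))

print(logdir)
```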
Start TensorBoard within the notebook using magics:
In [0]:
%tensorboard --logdir logs
You can now view dashboards such as scalars, graphs, histograms, and others. Some dashboards are not available yet in Colab (such as the profile plugin).
The %tensorboard magic has exactly the same format as the TensorBoard command line invocation, but with a %-sign in front of it.
You can also start TensorBoard before training to monitor it in progress:
In [0]:
%tensorboard --logdir logs
The same TensorBoard backend is reused by issuing the same command. If a different logs directory was chosen, a new instance of TensorBoard would be opened. Ports are managed automatically.
Start training a new model and watch TensorBoard update automatically every 30 seconds or refresh it with the button on the top right:
In [9]:
train_model()
You can use the tensorboard.notebook APIs for a bit more control:
In [10]:
from tensorboard import notebook
notebook.list() # View open TensorBoard instances
In [0]:
# Control TensorBoard display. If no port is provided,
# the most recently launched TensorBoard is used
notebook.display(port=6006, height=1000)