Visualisation with TensorBoard


In this lesson we will look at how to create and visualise a graph using TensorBoard. We touched briefly on TensorBoard in our first lesson, on variables.

So what is TensorBoard and why would we want to use it?

TensorBoard is a suite of web applications for inspecting and understanding your TensorFlow runs and graphs. TensorBoard currently supports five visualisations: scalars, images, audio, histograms, and graphs. The computations you will use in TensorFlow, for example when training a massive deep neural network, can be fairly complex and confusing; TensorBoard makes your TensorFlow programs much easier to understand, debug, and optimise.
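This lesson concentrates on the graph view, but as a quick taste of the scalar view, here is a minimal sketch using the TF 1.x summary API; the "loss" placeholder and the "scalar_output" directory are made up purely for illustration, and we will meet tf.summary.FileWriter properly later in this lesson.

import tensorflow as tf

# A toy scalar we pretend to track during training
loss = tf.placeholder(tf.float32, shape=(), name="loss")
loss_summary = tf.summary.scalar("loss", loss)

with tf.Session() as sess:
    # "scalar_output" is an arbitrary directory name for the event files
    writer = tf.summary.FileWriter("scalar_output", sess.graph)
    for step in range(100):
        # Write one scalar value per step; TensorBoard plots these over time
        summary = sess.run(loss_summary, feed_dict={loss: 1.0 / (step + 1)})
        writer.add_summary(summary, step)
    writer.close()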

To see TensorBoard in action, click here.

This is what a TensorBoard graph looks like:



The basic script

Below we have the basic script for building a TensorBoard graph. Right now, if you run it in a Python interpreter, all it will return is 63.

import tensorflow as tf

a = tf.add(1, 2)
b = tf.multiply(a, 3)
c = tf.add(4, 5)
d = tf.multiply(c, 6)
e = tf.multiply(4, 5)
f = tf.div(c, 6)
g = tf.add(b, d)
h = tf.multiply(g, f)

with tf.Session() as sess:
    print(sess.run(h))

Now we add a FileWriter to the end of our code. This will create a folder in your given directory containing the event data that TensorBoard uses to build the graph.

with tf.Session() as sess:
    writer = tf.summary.FileWriter("output", sess.graph)
    print(sess.run(h))
    writer.close()

If you run the script now, you will see a folder named ‘output’ appear in your given directory. Launch TensorBoard with tensorboard --logdir=path/to/logs/directory, pointing --logdir at that folder, then open the address shown in your terminal in a browser and click the Graphs tab to see your graph.

At this point the graph is kind of all over the place and fairly hard to read, so let's name some of the parts to make it more readable.

Adding names

In the code below we have added just one parameter, name=[something], to a few of the operations. This parameter gives the selected node a label on the graph.

a = tf.add(1, 2, name="Add_these_numbers")
b = tf.multiply(a, 3)
c = tf.add(4, 5, name="And_These_ones")
d = tf.multiply(c, 6, name="Multiply_these_numbers")
e = tf.multiply(4, 5, name="B_add")
f = tf.div(c, 6, name="B_mul")
g = tf.add(b, d)
h = tf.multiply(g, f)

Now if you re-run your Python file and then run tensorboard --logdir=path/to/logs/directory again, you will see that your graph has names on the specific parts you named. However, it is still very messy, and if this were a huge neural network it would be next to impossible to read.

Creating scopes

If we give the graph an overall name by wrapping the operations in with tf.name_scope("MyOperationGroup"): and group them into scopes such as with tf.name_scope("Scope_A"):, the graph looks very different when you re-run the script and reload TensorBoard. It is now much easier to read: the first block of operations sits under the graph header, in this case MyOperationGroup, with scopes A and B nested inside it and Scope_C alongside, each containing its own operations.

# Here we are defining the name of the graph, and scopes A, B and C.
with tf.name_scope("MyOperationGroup"):
    with tf.name_scope("Scope_A"):
        a = tf.add(1, 2, name="Add_these_numbers")
        b = tf.multiply(a, 3)
    with tf.name_scope("Scope_B"):
        c = tf.add(4, 5, name="And_These_ones")
        d = tf.multiply(c, 6, name="Multiply_these_numbers")

with tf.name_scope("Scope_C"):
    e = tf.multiply(4, 5, name="B_add")
    f = tf.div(c, 6, name="B_mul")
g = tf.add(b, d)
h = tf.multiply(g, f)

    

As you can see, the graph is now a lot easier to read.
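One way to check what the scopes are doing, without even opening TensorBoard, is to print the names of the tensors defined above. In a fresh run of the scoped script, the scope names appear as prefixes on the node names that TensorBoard displays:

# name scopes become prefixes on the node (and tensor) names
print(a.name)   # MyOperationGroup/Scope_A/Add_these_numbers:0
print(c.name)   # MyOperationGroup/Scope_B/And_These_ones:0
print(f.name)   # Scope_C/B_mul:0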



TensorBoard has a wide range of features, some of which we will cover in future lessons. If you want to dive deeper, start by watching this video from the 2017 TensorFlow Developers Conference.



In this lesson we looked at:

  1. The basic layout for a TensorBoard graph
  2. Adding a FileWriter to write the graph out for TensorBoard
  3. Adding names to the TensorBoard graph
  4. Adding name scopes to group the TensorBoard graph

Exercises

There’s a great third-party tool called TensorDebugger (TDB). TDB is, as the name says, a debugger. Unlike the standard TensorBoard workflow, where log files must be written first and cannot be inspected while the TensorFlow graph is still running, TDB interfaces directly with the execution of a TensorFlow graph and allows you to step through execution one node at a time.

  1. Install TDB from here and read the material (try the demo!).
  2. Use TDB with the gradient descent code below. Plot a graph showing the debugger working through the results and print the predicted model. (Note: this code is Python 2.7 compatible only.)
import tensorflow as tf
import numpy as np

# x and y are placeholders for our training data
x = tf.placeholder("float")
y = tf.placeholder("float")
# w is the variable storing our values. It is initialised with starting "guesses"
# w[0] is the "a" in our equation, w[1] is the "b"
w = tf.Variable([1.0, 2.0], name="w")
# Our model of y = a*x + b
y_model = tf.multiply(x, w[0]) + w[1]

# Our error is defined as the square of the differences
error = tf.square(y - y_model)
# The Gradient Descent Optimizer does the heavy lifting
train_op = tf.train.GradientDescentOptimizer(0.01).minimize(error)

# Normal TensorFlow - initialize values, create a session and run the model
model = tf.global_variables_initializer()

with tf.Session() as session:
    session.run(model)
    for i in range(1000):
        x_value = np.random.rand()
        y_value = x_value * 2 + 6
        session.run(train_op, feed_dict={x: x_value, y: y_value})

    w_value = session.run(w)
    print("Predicted model: {a:.3f}x + {b:.3f}".format(a=w_value[0], b=w_value[1]))
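If you want to tie this exercise back to the lesson, one optional extension (not part of the TDB exercise itself) is to write this gradient descent graph out with the same FileWriter approach used above; the directory name "gd_output" here is an arbitrary choice.

# Optional: write the gradient descent graph so it shows up on the Graphs tab
with tf.Session() as session:
    writer = tf.summary.FileWriter("gd_output", session.graph)
    writer.close()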



These special icons are used for constants and summary nodes.
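If you want to see these icons in a graph of your own, a minimal sketch like the following puts one constant node and one summary node on the Graphs tab; the node and directory names are arbitrary choices.

import tensorflow as tf

# one constant node and one summary node, so both special icons appear
c = tf.constant(5.0, name="my_constant")
s = tf.summary.scalar("my_summary", c)

with tf.Session() as sess:
    writer = tf.summary.FileWriter("icons_output", sess.graph)
    writer.add_summary(sess.run(s), 0)
    writer.close()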


Keep going!

We have a growing set of lessons that we hope will guide you through learning this powerful library. Follow these links to continue to our next lesson.

