Interactive Sessions

Now that we have a few examples under our belt, let us take a look at what is happening a bit more closely.

As we have identified earlier, TensorFlow allows us to create a graph of operations and variables. These variables are called Tensors, and represent data, whether that is a single number, a string, a matrix, or something else. Tensors are combined through operations, and this whole process is modelled in a graph.
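To make the "graph of operations" idea concrete, here is a toy sketch in plain Python. This is not TensorFlow's actual implementation, just an illustration of the principle: building nodes only records operations and their inputs, and nothing is computed until we explicitly walk the graph.

```python
# Toy computation graph: constructing nodes records the operations;
# evaluate() walks the graph and computes values on demand.
class Node:
    def __init__(self, op, *inputs):
        self.op = op          # a callable, or None for a constant
        self.inputs = inputs  # upstream nodes

def constant(value):
    node = Node(None)
    node.value = value
    return node

def add(a, b):
    return Node(lambda x, y: x + y, a, b)

def multiply(a, b):
    return Node(lambda x, y: x * y, a, b)

def evaluate(node):
    if node.op is None:
        return node.value
    return node.op(*(evaluate(i) for i in node.inputs))

# (a + b) * c is described first, then computed only when asked.
a, b, c = constant(2), constant(3), constant(4)
result = multiply(add(a, b), c)
print(evaluate(result))  # 20
```

TensorFlow's graphs work on whole tensors rather than single numbers, but the separation between describing a computation and running it is the same.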

First, make sure you have your tensorenv virtual environment activated. Once it is activated, type conda install jupyter to install Jupyter.

Then, run jupyter notebook to launch a browser session of the Jupyter Notebook (previously called the IPython Notebook). (If your browser doesn’t open, open it and type localhost:8888 into the browser’s address bar.)

Click “New” and then “Python 3” under “Notebooks”. This will launch a new browser tab. Give the notebook a name by clicking “Untitled” at the top (I used “Interactive TensorFlow”).

If you have never used a Jupyter Notebook (or IPython Notebook) before, take a look at this site for a brief introduction.

Next, as before, let’s create a basic TensorFlow program. One major change is the use of an InteractiveSession, which allows us to evaluate variables and operations without needing to constantly refer to the session object (less typing!). Code blocks below are broken into different cells. If you see a break in the code, you will need to run the previous cell first. Also, if you aren’t confident, ensure all of the code in a given block is typed into a cell before you run it.

import tensorflow as tf

session = tf.InteractiveSession()

x = tf.constant(list(range(10)))

In this section of code, we create an InteractiveSession, and then define a constant value, which is like a placeholder, but with a set value (that doesn’t change). In the next cell, we can evaluate this constant and print the result.

print(x.eval())

Next, we close the open session.

session.close()

Closing sessions is quite important, and can be easy to forget. For that reason, we were using the with keyword in earlier tutorials to handle this. When the with block is finished executing, the session will be closed (this also happens if an error happens - the session is still closed).
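The same guarantee applies to any Python context manager, not just sessions. A minimal sketch with a stand-in Resource class (hypothetical, not part of TensorFlow) shows that the cleanup code runs even when the block raises an error:

```python
# A stand-in resource that tracks whether it has been closed,
# mimicking the guarantee the `with` keyword gives Session.close().
class Resource:
    def __init__(self):
        self.closed = False
    def __enter__(self):
        return self
    def __exit__(self, exc_type, exc_value, traceback):
        self.closed = True   # runs on normal exit AND on error
        return False         # let any exception propagate

r = Resource()
try:
    with r:
        raise ValueError("something went wrong")
except ValueError:
    pass

print(r.closed)  # True: the resource was closed despite the error
```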

Now let's take a look at a larger example. In this example, we will take a very large matrix and compute with it, keeping track of when memory is used. First, let's find out how much memory our Python session is currently using:

import resource
print("{} Kb".format(resource.getrusage(resource.RUSAGE_SELF).ru_maxrss))

On my system, this reports 78,496 kilobytes after running the above code. Now, create a new session, and define two matrices:

import numpy as np
session = tf.InteractiveSession()

X = tf.constant(np.eye(10000))
Y = tf.constant(np.random.randn(10000, 300))

Let’s take a look at our memory usage again:

print("{} Kb".format(resource.getrusage(resource.RUSAGE_SELF).ru_maxrss))

On my system, the memory usage jumped to 885,220 Kb - those matrices are large!
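That jump is roughly what we would expect from the raw array sizes: np.eye(10000) holds 10,000 × 10,000 64-bit floats, and randn(10000, 300) another 10,000 × 300. A quick back-of-the-envelope check:

```python
# Each element of a float64 array takes 8 bytes.
eye_kb = 10000 * 10000 * 8 / 1024    # identity matrix: ~781,250 Kb
randn_kb = 10000 * 300 * 8 / 1024    # random matrix:   ~23,438 Kb
print(eye_kb + randn_kb)             # ~804,688 Kb, close to the jump above
```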

Now, let’s multiply those matrices together using matmul:

Z = tf.matmul(X, Y)

If we check our memory usage now, we find that no more memory has been used – no actual computation of Z has taken place. It is only when we evaluate the operation that we actually compute it. For an interactive session, you can just use Z.eval(), rather than session.run(Z). Note that you can’t always rely on .eval(), as this is a shortcut that uses the “default” session, not necessarily the one you want to use.
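To see why the .eval() shortcut can surprise you, here is a toy sketch of the "default session" idea in plain Python (hypothetical names, not TensorFlow's real code): the shortcut simply looks up whichever session is currently installed as the default, so it can silently switch if several sessions exist.

```python
# Toy sketch: eval_shortcut() uses a module-level default, the way
# Tensor.eval() uses TensorFlow's default session.
_default_session = None

class ToySession:
    def __init__(self, name):
        self.name = name
    def install_as_default(self):
        global _default_session
        _default_session = self
    def run(self, value):
        return "{} ran {}".format(self.name, value)

def eval_shortcut(value):
    if _default_session is None:
        raise RuntimeError("no default session")
    return _default_session.run(value)

a = ToySession("A")
b = ToySession("B")
a.install_as_default()
print(eval_shortcut("Z"))  # A ran Z
b.install_as_default()
print(eval_shortcut("Z"))  # B ran Z -- the shortcut silently switched sessions
```

Passing the session explicitly, as session.run(Z) does, avoids this ambiguity.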

If your computer is lower end (for this example, less than 3 GB of RAM), then don't run this code - just trust me!

Z.eval()

Your computer will think for quite a while, because only now is it actually performing the action of multiplying those matrices. Checking the memory usage afterwards reveals that this computation has happened, as it now uses nearly 3 GB!

print("{} Kb".format(resource.getrusage(resource.RUSAGE_SELF).ru_maxrss))

Don’t forget to close your session!

session.close()

Exercises

Note: I recommend using a new Jupyter Notebook, as the above example code may accidentally be executed again, possibly causing a computer crash!

1) Create a large matrix of at least 10,000,000 integer values (for example, use NumPy’s randint function). Check the memory usage after the matrix is created. Then, convert the matrix to float values using TensorFlow’s to_float function, and check the memory usage again to see an increase of more than double. The “doubling” is caused by a copy of the matrix being created, but what is the cause of the “extra increase”? After performing this experiment, you can use this code to display the image.

from PIL import Image
from io import BytesIO

# read data from string
im = Image.open(BytesIO(result))
im

**Hint:** Ensure that you carefully measure memory usage after each step, as just importing TensorFlow will use quite a bit of memory itself.
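For those measurement steps, a small helper function (hypothetical, and Unix-only, since the resource module is not available on Windows) keeps the bookkeeping out of the way. Note that ru_maxrss is reported in kilobytes on Linux but in bytes on macOS:

```python
import resource

def mem_kb():
    """Peak memory used by this process so far (kilobytes on Linux)."""
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

before = mem_kb()
data = list(range(1000000))  # allocate something sizeable
after = mem_kb()
print("grew by {} Kb".format(after - before))
```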

2) Use TensorFlow’s image functions to convert the image from the previous tutorial (or another image) to JPEG with different functions and record the memory usage.
