Variables

  • Variables must be initialized by running an init Op after the graph has been launched. The init Op must first be added to the graph.

In [2]:
import tensorflow as tf

In [3]:
# creating a variable: note we gave an initial value
state = tf.Variable(0, name="counter")
one = tf.constant(1)
incr = tf.add(state, one)

# state = state + one would NOT update the variable;
# it only rebinds the Python name state to a new add op

update = tf.assign(state, incr)

# the initialization operation
init_op = tf.initialize_all_variables()

with tf.Session() as sess:
    # print(sess.run(state))  ERROR: state is not yet initialized
    sess.run(init_op)         # initialize the variables
    print(sess.run(state))    # 0
    sess.run(incr)            # computes state + 1 but does not store it
    print(incr)               # prints the Tensor object, not its value
    sess.run(update)          # assigns state + 1 back into state
    print(update)             # again, just the Tensor object
    print(sess.run(state))    # 1


0
Tensor("Add:0", shape=(), dtype=int32)
Tensor("Assign:0", shape=(), dtype=int32_ref)
1

Custom Initialization

The convenience function tf.initialize_all_variables() adds an op to initialize all variables in the model. You can also pass it an explicit list of variables to initialize. See the Variables Documentation for more options, including checking if variables are initialized.

Initialization from another Variable

You sometimes need to initialize a variable from the initial value of another variable. Since the op added by tf.initialize_all_variables() initializes all variables in parallel, you have to be careful when one variable's initial value depends on another's.

To initialize a new variable from the value of another variable, use the other variable's initialized_value() property. You can use the initialized value directly as the initial value for the new variable, or you can use it like any other tensor to compute a value for the new variable.


In [4]:
weights = tf.Variable(tf.random_normal(shape = (3,3),mean = 0,stddev = 1.0),name = "weights")
biases = tf.Variable(tf.random_uniform(shape = (3,1),minval = -1,maxval = 1),name = "biases")

w2 = tf.Variable(weights.initialized_value(),name = "w2")
b2 = tf.Variable(biases.initialized_value()*2,name = "b2")

# tf.initialize_all_variables() takes no argument list, so these DIDN'T WORK;
# the list-accepting op is tf.initialize_variables (later tf.variables_initializer):
# init_op1 = tf.initialize_variables([weights])
# init_op2 = tf.initialize_variables([biases])
# init_op3 = tf.initialize_variables([w2, b2])

init = tf.initialize_all_variables()

with tf.Session() as sess:
    # sess.run(init_op1)
    # sess.run(init_op2)
    # sess.run(init_op3)
    sess.run(init)
    
    print(sess.run(weights))
    print(sess.run(biases))
    print(sess.run(w2))
    print(sess.run(b2))


[[ 0.38019025  0.66199875 -1.05253553]
 [-1.04309237 -0.29391313  0.01105313]
 [-1.87257981  0.22100802  0.06594836]]
[[-0.00422788]
 [-0.49569464]
 [ 0.12291121]]
[[ 0.38019025  0.66199875 -1.05253553]
 [-1.04309237 -0.29391313  0.01105313]
 [-1.87257981  0.22100802  0.06594836]]
[[-0.00845575]
 [-0.99138927]
 [ 0.24582243]]

In [8]:
x = tf.constant(35,name = 'x')
y = tf.Variable(x+5,name = 'y')

with tf.Session() as sess:
    # sess.run(x)           # NO NEED TO INITIALIZE OR RUN A CONSTANT
    sess.run(y.initializer)
    print(sess.run(y))


40

In [10]:
x = tf.constant([1,2,3])
y = tf.Variable(x+5)

with tf.Session() as sess:
    sess.run(y.initializer)
    print(sess.run(y))


[6 7 8]

In [12]:
with tf.Session() as sess:
    print(x.eval())


[1 2 3]

Placeholders

  • cannot be evaluated on their own: eval() without a feed raises an error
  • data needs to be fed to them through the feed_dict argument
  • used for taking input and output, i.e. values that don't change during the course of learning

Feeding

  • TensorFlow's feed mechanism lets you inject data into any Tensor in a computation graph. A Python computation can thus feed data directly into the graph.

  • Supply feed data through the feed_dict argument to a run() or eval() call that initiates computation.

with tf.Session():
    input = tf.placeholder(tf.float32)
    classifier = ...
    print(classifier.eval(feed_dict={input: my_python_preprocessing_fn()}))

  • While you can replace any Tensor with feed data, including variables and constants, the best practice is to use a placeholder op node. A placeholder exists solely to serve as the target of feeds. It is not initialized and contains no data. A placeholder generates an error if it is executed without a feed, so you won't forget to feed it.

In [14]:
import numpy as np

x = tf.placeholder(tf.float32,shape = (3,3),name = "x")
y = tf.matmul(x,x)

with tf.Session() as sess:
    rnd = np.random.rand(3,3)
    result = sess.run(y,feed_dict = {x:rnd})
    print(result)


[[ 1.36074042  0.25870848  0.74567944]
 [ 2.01679397  0.41987088  1.10313272]
 [ 1.24884903  0.21098296  0.87478375]]

In [ ]:
# giving partial shapes

x = tf.placeholder("float", [None, 3])   # num_rows can be any number, but num_cols must be 3