In TensorFlow 1.x, every operation you write becomes a node in a computational graph. When you work with TensorFlow, constants, variables, and placeholders come in handy for defining the input data, class labels, weights, and biases.
Constants
A constant takes no input; you use it to store a fixed value, and it always produces that stored value as its output.
Code with Tensorflow Version 1.x
%tensorflow_version 1.x
import tensorflow as tf
print(tf.__version__)
a = tf.constant(2.0)
b = tf.constant(3.0)
c = a * b
In TensorFlow 1.x, building the graph does not compute anything; to evaluate c you have to run the graph inside a Session:
sess = tf.Session()
sess.run(c)
Output - 1.15.2
6.0
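As an aside, in TensorFlow 1.x the session is often opened as a context manager so that it is closed automatically. A minimal sketch of the same computation:
%tensorflow_version 1.x
import tensorflow as tf

a = tf.constant(2.0)
b = tf.constant(3.0)
c = a * b

# The context manager releases the session's resources when the block exits
with tf.Session() as sess:
    print(sess.run(c))  # 6.0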
Code with Tensorflow Version 2.x
%tensorflow_version 2.x
import tensorflow as tf
print(tf.__version__)
a = tf.constant(2.0)
b = tf.constant(3.0)
c = a * b
tf.print(c)
Output - 2.4.1
6
However, starting with TensorFlow 2.0, eager execution is enabled by default, so you can print the result directly without creating a session.
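If you need the result as a plain NumPy value rather than an eager tensor, you can call .numpy() on it. A minimal sketch:
%tensorflow_version 2.x
import tensorflow as tf

a = tf.constant(2.0)
b = tf.constant(3.0)
c = a * b

# In eager mode the tensor already holds its value; .numpy() extracts it
print(c.numpy())  # 6.0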
Placeholders
Placeholders are values that are left unassigned when the graph is built and are initialized by the session when you run it. As the name already gives away, a placeholder is simply a stand-in for a tensor that will always be fed when the session is run.
Code with Tensorflow Version 1.x
%tensorflow_version 1.x
import tensorflow as tf
print(tf.__version__)
a = tf.placeholder(tf.float32)
b = tf.placeholder(tf.float32)
add = a + b
sess = tf.Session()
# Execute add by feeding the values [1, 3] and [2, 4] for a and b respectively
output = sess.run(add, {a: [1,3], b: [2, 4]})
print('Adding a and b:', output)
Output - 1.15.2
Adding a and b: [3. 7.]
Code with Tensorflow Version 2.x
%tensorflow_version 2.x
import tensorflow as tf
print(tf.__version__)
a = tf.Variable(tf.ones(shape=(2, 2)), name="a")
b = tf.Variable(tf.zeros(shape=(2,)), name="b")
print(a + b)
Output -
tf.Tensor(
[[1. 1.]
 [1. 1.]], shape=(2, 2), dtype=float32)
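Note that tf.placeholder no longer exists in TensorFlow 2.x. A common replacement pattern (a minimal sketch, not the only approach) is to pass the data directly as function arguments, optionally wrapping the computation in tf.function so it is still traced into a graph:
%tensorflow_version 2.x
import tensorflow as tf

@tf.function
def add(a, b):
    # a and b play the role of the old placeholders: they are fed when the function is called
    return a + b

output = add(tf.constant([1.0, 3.0]), tf.constant([2.0, 4.0]))
print('Adding a and b:', output.numpy())  # Adding a and b: [3. 7.]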
Variables
Variables hold state that can be modified over time, so the same graph can produce new outputs for the same inputs as the stored values change. In TensorFlow 1.x you need to explicitly call the global variable initializer, global_variables_initializer(), which initializes all the variables defined in your code. Variables survive across multiple executions of a graph, unlike ordinary tensors, which are only instantiated when the graph is run and are discarded immediately afterwards.
We have seen that placeholders are used for holding the input data and class labels, whereas variables are used for weights and biases.
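To make that division of labour concrete, here is a minimal TensorFlow 1.x sketch (the names x, W, b, and y are purely illustrative) in which the input is fed through a placeholder while the weight and bias live in variables:
%tensorflow_version 1.x
import tensorflow as tf

x = tf.placeholder(tf.float32, shape=(None,))  # input data is fed at run time
W = tf.Variable(0.3, dtype=tf.float32)         # weight: state that training would update
b = tf.Variable(-0.3, dtype=tf.float32)        # bias: state that training would update
y = W * x + b

sess = tf.Session()
sess.run(tf.global_variables_initializer())
print(sess.run(y, {x: [1.0, 2.0, 3.0]}))       # approximately [0.  0.3  0.6]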
Code with Tensorflow Version 1.x
%tensorflow_version 1.x
import tensorflow as tf
print(tf.__version__)
# Variables are defined by providing their initial value and type
variable = tf.Variable([0.9, 0.7], dtype=tf.float32)
print(variable)
# Variables must be initialized explicitly before the graph is run for the first time
init = tf.global_variables_initializer()
sess = tf.Session()
sess.run(init)
print(sess.run(variable))
Output - 1.15.2
<tf.Variable 'Variable_6:0' shape=(2,) dtype=float32_ref>
[0.9 0.7]
Code with Tensorflow Version 2.x
%tensorflow_version 2.x
import tensorflow as tf
print(tf.__version__)
# Variables are defined by providing their initial value and type
variable = tf.Variable([0.9, 0.7], dtype=tf.float32)
print(variable)
Output - 2.4.1
<tf.Variable 'Variable:0' shape=(2,) dtype=float32, numpy=array([0.9, 0.7], dtype=float32)>
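Because a variable's value can be changed after it is created, you can update it in place with assign() and assign_add(). A minimal TensorFlow 2.x sketch:
%tensorflow_version 2.x
import tensorflow as tf

variable = tf.Variable([0.9, 0.7], dtype=tf.float32)
variable.assign([0.5, 0.5])      # overwrite the stored value
variable.assign_add([0.1, 0.1])  # add to the stored value in place
print(variable.numpy())          # [0.6 0.6]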