I read this tutorial and found two confusing terms: tf.placeholder and tf.Variable. So I looked into the difference between them.
A tf.placeholder is given its value only when a session runs a computation. Once you feed a value to a tf.placeholder, it cannot change its own value.
A tf.Variable, by contrast, can change its own value through the assign method, so tf.Variable is literally a "variable."
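The difference above can be seen in a minimal sketch. This uses the 1.x-style graph API; when TensorFlow 2 is installed, the same API is available under tf.compat.v1, which I assume here:

```python
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

x = tf.placeholder(tf.float32)   # no value yet; it must be fed at run time
v = tf.Variable(1.0)             # stateful; starts at 1.0
update = v.assign(v + 1.0)       # assign changes the variable's own state

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(x, feed_dict={x: 3.0}))  # placeholder value comes from feed_dict
    print(sess.run(update))                  # variable becomes 2.0
    print(sess.run(update))                  # state persists across runs: 3.0
```

Running the placeholder without a feed_dict raises an error, while the variable keeps its state between sess.run calls.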
Finally, I found this question, which was helpful for me. The top-rated answer said:
You use tf.Variable for trainable variables such as weights and biases for your model.
import math
import tensorflow as tf

weights = tf.Variable(
    tf.truncated_normal([IMAGE_PIXELS, hidden1_units],
                        stddev=1.0 / math.sqrt(float(IMAGE_PIXELS))),
    name='weights')
biases = tf.Variable(tf.zeros([hidden1_units]), name='biases')
tf.placeholder is used to feed actual training examples.
images_placeholder = tf.placeholder(tf.float32, shape=(batch_size, IMAGE_PIXELS))
labels_placeholder = tf.placeholder(tf.int32, shape=(batch_size))
for step in range(FLAGS.max_steps):
    feed_dict = {
        images_placeholder: images_feed,
        labels_placeholder: labels_feed,
    }
    _, loss_value = sess.run([train_op, loss], feed_dict=feed_dict)
And another answer said:
The more important difference is their role within TensorFlow. Variables are trained over time; placeholders are input data that doesn't change as your model trains (like input images, and class labels for those images).
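That role difference can be sketched end to end with a tiny, hypothetical model: the variable w is what training updates, while the placeholder-fed inputs stay fixed. The numbers and the model (y ≈ w * x) are my own illustration, again using the 1.x-style API via tf.compat.v1:

```python
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

x = tf.placeholder(tf.float32)   # input data: fed each step, never trained
y = tf.placeholder(tf.float32)   # target labels: also fed, never trained
w = tf.Variable(0.0)             # model parameter: updated by the optimizer

loss = tf.square(w * x - y)
train_op = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(100):
        # the same example (x=2, y=6) is fed every step; only w changes
        sess.run(train_op, feed_dict={x: 2.0, y: 6.0})
    print(sess.run(w))  # approaches 3.0, since y = 3 * x
```

The placeholders carry the data in, and gradient descent trains only the tf.Variable.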
Thanks to the StackOverflow answers, I now have a rough understanding of my question.