tf.variable_scope lets variables created by both tf.get_variable and tf.Variable share the same scope name. Note that when the reuse flag is True, tf.get_variable returns the existing variable instead of creating a new one, so callers share the same value.
tf.name_scope only affects variables created with tf.Variable (and ops); tf.get_variable ignores it.
Furthermore, tf.name_scope prefixes op names, while tf.variable_scope prefixes the names of variables created by tf.Variable as well as by tf.get_variable().
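The naming rules above can be checked directly (a minimal sketch for TensorFlow 1.x; on TensorFlow 2 the same code runs through tf.compat.v1, and the scope names "vs"/"ns" are just for illustration):

```python
import tensorflow.compat.v1 as tf  # plain `import tensorflow as tf` on TF 1.x
tf.disable_eager_execution()

with tf.variable_scope("vs"):
    v1 = tf.get_variable("v1", [1])   # prefixed: "vs/v1:0"
    v2 = tf.Variable(1.0, name="v2")  # prefixed: "vs/v2:0"

with tf.name_scope("ns"):
    v3 = tf.get_variable("v3", [1])   # NOT prefixed: "v3:0"
    v4 = tf.Variable(1.0, name="v4")  # prefixed: "ns/v4:0"
    total = tf.add(v3, v4)            # the op name also gets the "ns/" prefix

print(v1.name, v2.name, v3.name, v4.name)
```

Only v3 escapes the prefix, because tf.get_variable looks at the surrounding variable_scope, not the name_scope.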
tf.variable_scope
def conv_relu(input, kernel_shape, bias_shape):
    # Create variable named "weights".
    weights = tf.get_variable("weights", kernel_shape,
                              initializer=tf.random_normal_initializer())
    # Create variable named "biases".
    biases = tf.get_variable("biases", bias_shape,
                             initializer=tf.constant_initializer(0.0))
    conv = tf.nn.conv2d(input, weights,
                        strides=[1, 1, 1, 1], padding='SAME')
    return tf.nn.relu(conv + biases)
def two_conv2d_network(input_images):
    with tf.variable_scope("conv1"):
        # Variables created here will be named "conv1/weights", "conv1/biases".
        relu1 = conv_relu(input_images, [5, 5, 32, 32], [32])
    with tf.variable_scope("conv2"):
        # Variables created here will be named "conv2/weights", "conv2/biases".
        return conv_relu(relu1, [5, 5, 32, 32], [32])
This way conv1 and conv2 reuse the conv_relu code while each layer keeps its own weights and biases, and we don't have to invent a new variable name for every layer.
When we instead want the parameters to share the same values:
def apply_network_twice(input_images):
    with tf.variable_scope("image_filters") as scope:
        image_first_step = two_conv2d_network(input_images)
        scope.reuse_variables()  # later get_variable calls return the existing variables
        image_second_step = two_conv2d_network(image_first_step)
        return image_second_step
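A quick way to see what reuse does (a minimal sketch for TensorFlow 1.x, via tf.compat.v1 on TensorFlow 2; the scope name "shared" is illustrative): after scope.reuse_variables() is called, tf.get_variable with the same name looks up the existing variable instead of creating a new one.

```python
import tensorflow.compat.v1 as tf  # plain `import tensorflow as tf` on TF 1.x
tf.disable_eager_execution()

with tf.variable_scope("shared") as scope:
    a = tf.get_variable("w", [2, 2])   # creates "shared/w"
    scope.reuse_variables()            # from here on, get_variable looks up
    b = tf.get_variable("w", [2, 2])   # returns the existing "shared/w"

print(a is b)  # True: the very same variable object
```

Without reuse_variables(), the second tf.get_variable("w", ...) would raise a ValueError complaining that the variable already exists.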
notice: nested variable scopes inherit the reuse value from the enclosing scope. That means once reuse is enabled on an outer scope, it is enabled for every scope nested inside it and cannot be switched back off from within; when you no longer want variables to share the same values, exit the variable_scope.
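The inheritance rule can be sketched as follows (TensorFlow 1.x, via tf.compat.v1 on TensorFlow 2; the scope names "outer"/"inner" are illustrative): reuse is set only on the outer scope, yet the inner scope picks it up automatically.

```python
import tensorflow.compat.v1 as tf  # plain `import tensorflow as tf` on TF 1.x
tf.disable_eager_execution()

# First, create outer/inner/v.
with tf.variable_scope("outer"):
    with tf.variable_scope("inner"):
        v = tf.get_variable("v", [1])

# Enable reuse only on the outer scope; the inner scope inherits it.
with tf.variable_scope("outer", reuse=True) as outer:
    with tf.variable_scope("inner") as inner:
        v2 = tf.get_variable("v", [1])  # looks up the existing outer/inner/v

print(outer.reuse, inner.reuse)  # True True: reuse propagates downward
print(v is v2)                   # True: the same variable object
```

There is no way to pass reuse=False to the inner scope to opt out; code that must create fresh variables has to run outside the reusing scope.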