Based on TF1
TensorFlow's basic data types:
tf.constant(value, dtype=None, shape=None, name='Const', verify_shape=False)
tf.Variable(initializer, name)
tf.placeholder(dtype, shape=None, name=None)
Let's look at which nodes are created in the graph when each of these data types is used.
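Before walking through each type, here is a minimal sketch (the names c, v, and p are hypothetical) that creates one of each and lists the node names added to the default graph; the names it prints should match the dumps in the sections below.

import tensorflow as tf

g = tf.get_default_graph()
c = tf.constant(1.0, name="c")             # one Const node
v = tf.Variable(2.0, name="v")             # four nodes: v/initial_value, v, v/Assign, v/read
p = tf.placeholder(tf.float32, name="p")   # one Placeholder node
print([n.name for n in g.as_graph_def().node])
# Expected (TF1): ['c', 'v/initial_value', 'v', 'v/Assign', 'v/read', 'p']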
constant
import tensorflow as tf

# Print the nodes currently in the graph
def dump_graph(g, filename):
    print(filename)
    print(g.as_graph_def())

# Get the default graph
g = tf.get_default_graph()
cons = tf.constant([1, 2, 3, 4, 5, 6, 7], name="const_array")  # define a constant array
dump_graph(g, 'after_cons_creation.graph')
init = tf.global_variables_initializer()  # variable initializer
dump_graph(g, 'after_initializer_creation.graph')
with tf.Session() as sess:
    sess.run(init)
    dump_graph(g, 'after_initializer_run.graph')
    # Write the graph to TensorBoard
    file_write = tf.summary.FileWriter('/home/jiadongfeng/tensorflow/board/', graph=sess.graph)
- after_cons_creation.graph
After cons = tf.constant([1, 2, 3, 4, 5, 6, 7], name="const_array") runs, the graph contains a single node, Const (named "const_array"), which holds the cons constant. Constants do not need to be initialized in a session.
node {
  name: "const_array"
  op: "Const"
  attr {
    key: "dtype"
    value {
      type: DT_INT32
    }
  }
  attr {
    key: "value"
    value {
      tensor {
        dtype: DT_INT32
        tensor_shape {
          dim {
            size: 7
          }
        }
        tensor_content: "\001\000\000\000\002\000\000\000\003\000\000\000\004\000\000\000\005\000\000\000\006\000\000\000\007\000\000\000"
      }
    }
  }
}
versions {
  producer: 38
}
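As a side note, the tensor_content field above is just the seven int32 values serialized as raw bytes. A minimal sketch that reuses g from the code above (node[0] is the const_array node because it was created first; np.frombuffer assumes a little-endian platform):

import numpy as np
import tensorflow as tf

const_node = g.as_graph_def().node[0]        # the "const_array" node dumped above
proto = const_node.attr["value"].tensor      # the TensorProto holding the values
print(tf.make_ndarray(proto))                # [1 2 3 4 5 6 7]
print(np.frombuffer(proto.tensor_content, dtype=np.int32))  # the same seven values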
- after_initializer_creation.graph
After tf.global_variables_initializer() runs, the graph contains the following nodes:
- Const
- init
after_initializer_creation.graph
node {
  name: "const_array"
  op: "Const"
  ...
}
node {
  name: "init"
  op: "NoOp"
}
versions {
  producer: 38
}
- after_initializer_run.graph
From the printed graph we can see that although calling global_variables_initializer() added an init node to the graph, that node is a NoOp with no inputs and does no work, since there are no variables to initialize.
We can also see that the constant's dtype, shape, and actual values are all contained in a single node.
after_initializer_run.graph
node {
  name: "const_array"
  op: "Const"
  attr {
    key: "dtype"
    value {
      type: DT_INT32
    }
  }
  attr {
    key: "value"
    value {
      tensor {
        dtype: DT_INT32
        tensor_shape {
          dim {
            size: 7
          }
        }
        tensor_content: "\001\000\000\000\002\000\000\000\003\000\000\000\004\000\000\000\005\000\000\000\006\000\000\000\007\000\000\000"
      }
    }
  }
}
node {
  name: "init"
  op: "NoOp"
}
versions {
  producer: 38
}
TensorBoard graph:
(figure: constant graph)
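To illustrate that constants need no initialization, here is a minimal sketch that reuses cons from the example above and evaluates it directly, without running init:

with tf.Session() as sess:
    print(sess.run(cons))   # [1 2 3 4 5 6 7], no initializer needed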
Variables
tf.Variable(initializer, name)
- initializer: the initial value
- name: the variable's name
Example:
import tensorflow as tf

def dump_graph(g, filename):
    print(filename)
    print(g.as_graph_def())

g = tf.get_default_graph()
var = tf.Variable(3)
dump_graph(g, 'after_var_creation.graph')
init = tf.global_variables_initializer()
dump_graph(g, 'after_initializer_creation.graph')
with tf.Session() as sess:
    sess.run(init)
    dump_graph(g, 'after_initializer_run.graph')
    file_write = tf.summary.FileWriter('/home/jiadongfeng/tensorflow/board/', graph=sess.graph)
- after_var_creation.graph
after_var_creation.graph
node {
  name: "Variable/initial_value"
  op: "Const"
  ...
  attr {
    key: "value"
    value {
      tensor {
        dtype: DT_INT32
        tensor_shape {
        }
        int_val: 3
      }
    }
  }
}
node {
  name: "Variable"
  op: "VariableV2"
  ...
}
node {
  name: "Variable/Assign"
  op: "Assign"
  input: "Variable"
  input: "Variable/initial_value"
  ...
}
node {
  name: "Variable/read"
  op: "Identity"
  input: "Variable"
  ...
}
versions {
  producer: 38
}
After tf.Variable(3) runs, the following nodes are created in the graph (see the sketch after this list):
- Variable/initial_value
- Variable
- Variable/Assign
- Variable/read
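A minimal sketch, reusing var from the example above, showing that each of these four nodes can be reached from the Python Variable object (names assume the default "Variable" scope):

print(var.initial_value.name)   # Variable/initial_value:0  -> the Const node
print(var.op.name)              # Variable                  -> the VariableV2 node
print(var.initializer.name)     # Variable/Assign           -> the Assign op
print(var.value().name)         # Variable/read:0           -> the Identity node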
TensorBoard graph after the variable is created:
(figure: var_create.png)

- after_initializer_creation.graph
...
node {
  name: "init"
  op: "NoOp"
  input: "^Variable/Assign"
}
After tf.global_variables_initializer() runs, the graph's nodes become (see the sketch after this list):
- Variable/initial_value
- Variable
- Variable/Assign
- Variable/read
- init: initializes the variables in the graph
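The following sketch, reusing var and init from the example above, shows what init actually does: VariableV2 only allocates storage, and reading the variable fails until the Variable/Assign node (triggered by init) has written the initial value.

with tf.Session() as sess:
    try:
        sess.run(var)                # fails: the variable has no value yet
    except tf.errors.FailedPreconditionError:
        print("variable not yet initialized")
    sess.run(init)                   # runs Variable/Assign via the init NoOp
    print(sess.run(var))             # 3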
TensorBoard graph:
After the initializer is created, the init node appears in the graph. The node itself is a NoOp that performs no computation; the dashed edge into it is its control-dependency input ^Variable/Assign.
- after_initializer_run.graph
node {
  name: "Variable/initial_value"
  op: "Const"
  attr {
    key: "dtype"
    value {
      type: DT_INT32
    }
  }
  attr {
    key: "value"
    value {
      tensor {
        dtype: DT_INT32
        tensor_shape {
        }
        int_val: 3
      }
    }
  }
}
node {
  name: "Variable"
  op: "VariableV2"
  attr {
    key: "container"
    value {
      s: ""
    }
  }
  attr {
    key: "dtype"
    value {
      type: DT_INT32
    }
  }
  attr {
    key: "shape"
    value {
      shape {
      }
    }
  }
  attr {
    key: "shared_name"
    value {
      s: ""
    }
  }
}
node {
  name: "Variable/Assign"
  op: "Assign"
  input: "Variable"
  input: "Variable/initial_value"
  attr {
    key: "T"
    value {
      type: DT_INT32
    }
  }
  attr {
    key: "_class"
    value {
      list {
        s: "loc:@Variable"
      }
    }
  }
  attr {
    key: "use_locking"
    value {
      b: true
    }
  }
  attr {
    key: "validate_shape"
    value {
      b: true
    }
  }
}
node {
  name: "Variable/read"
  op: "Identity"
  input: "Variable"
  attr {
    key: "T"
    value {
      type: DT_INT32
    }
  }
  attr {
    key: "_class"
    value {
      list {
        s: "loc:@Variable"
      }
    }
  }
}
node {
  name: "init"
  op: "NoOp"
  input: "^Variable/Assign"
}
versions {
  producer: 38
}
TensorBoard graph after running the initializer:
(figure: variable graph)

placeholder
tf.placeholder(dtype, shape=None, name=None)
- dtype: the data type, most commonly a numeric type such as tf.float32 or tf.float64
- shape: the data shape. The default None places no constraint on the shape; it can also be multi-dimensional, e.g. [2, 3] means 2 rows and 3 columns, and [None, 3] means 3 columns with an arbitrary number of rows (see the sketch after this list)
- name: the name of the placeholder op
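A minimal sketch of the shape argument (the names x and row_sum are hypothetical): a placeholder with shape [None, 3] accepts any number of rows, as long as each row has 3 columns.

import tensorflow as tf

x = tf.placeholder(tf.float32, shape=[None, 3], name="x")
row_sum = tf.reduce_sum(x, axis=1)

with tf.Session() as sess:
    print(sess.run(row_sum, feed_dict={x: [[1., 2., 3.]]}))                # one row
    print(sess.run(row_sum, feed_dict={x: [[1., 2., 3.], [4., 5., 6.]]}))  # two rows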
Why use tf.placeholder?
Every tensor value in the graph is an op. Once a placeholder is created, it can be fed a new value on every run, yet it only ever adds a single node to the graph, which greatly reduces overhead.
Example:
import tensorflow as tf

def dump_graph(g, filename):
    print(filename)
    print(g.as_graph_def())

g = tf.get_default_graph()
input1 = tf.placeholder(tf.float32, None)
input2 = tf.placeholder(tf.float32, None)
dump_graph(g, 'after_var_creation.graph')
output = tf.multiply(input1, input2)
with tf.Session() as sess:
    print(sess.run(output, feed_dict={input1: [3.], input2: [4.]}))
    print(sess.run(output, feed_dict={input1: [5.], input2: [6.]}))
    file_write = tf.summary.FileWriter('/home/jiadongfeng/tensorflow/board/', graph=sess.graph)
After tf.placeholder(tf.float32, None) runs, one node is created in the graph:
Placeholder
The output is:
after_var_creation.graph
node {
  name: "Placeholder"
  op: "Placeholder"
  attr {
    key: "dtype"
    value {
      type: DT_FLOAT
    }
  }
  attr {
    key: "shape"
    value {
      shape {
        unknown_rank: true
      }
    }
  }
}
versions {
  producer: 38
}
[12.]
[30.]
TensorBoard graph:
As the placeholder node graph shows, each placeholder produces only one node in the graph, no matter how many times it is fed and reused; the sketch below verifies this by counting nodes.
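A minimal sketch of that claim (all names here are hypothetical): feeding the same placeholder repeatedly adds no new nodes, whereas building a fresh constant for every value keeps growing the graph.

import tensorflow as tf

g = tf.get_default_graph()
x = tf.placeholder(tf.float32, name="x")
y = tf.multiply(x, 2.0, name="y")

with tf.Session() as sess:
    for v in [1.0, 2.0, 3.0]:
        sess.run(y, feed_dict={x: v})
    print(len(g.as_graph_def().node))    # unchanged by the three runs above

    for v in [1.0, 2.0, 3.0]:
        sess.run(tf.multiply(tf.constant(v), 2.0))   # each call adds new nodes
    print(len(g.as_graph_def().node))    # larger than the previous count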