Visualizing with TensorBoard, Part 2

By 马淑 | Published 2018-07-15 17:36

To visualize the training process, build on the code from 《Tensorboard进行可视化 1》 (Part 1) and make the following changes:

  1. Add an n_layer parameter to the add_layer() method to identify the layer number, and use a variable layer_name to hold each layer's name.

  2. Because add_layer() now takes an n_layer argument, pass it explicitly when adding each layer:

# add hidden layer
l1 = add_layer(xs, 1, 10, n_layer=1, activation_function=tf.nn.relu) # The first hidden layer
# add output layer
prediction = add_layer(l1, 10, 1, n_layer=2, activation_function=None) # The output layer
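As a quick sanity check (my own addition, assuming the add_layer() defined in the next step), you can print a tensor's name to confirm the scoping:

print(l1.name)          # e.g. 'layer1/Relu:0' -- l1 lives under the layer1 name scope
print(prediction.name)  # e.g. 'layer2/Wx_plus_b/Add:0'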
  3. Use tf.summary.histogram() (the current name for the old tf.histogram_summary()) to record each variable for plotting. The first argument is the chart name, layer_name + '/<variable name>'; the second is the variable to record: Weights, biases, or outputs.
def add_layer(inputs, in_size, out_size, n_layer, activation_function=None):
        layer_name = 'layer%s' % n_layer
        with tf.name_scope(layer_name):
                with tf.name_scope('weights'):
                        Weights = tf.Variable(tf.random_normal([in_size, out_size]), name='W') # weight matrix
                        tf.summary.histogram(layer_name + '/weights', Weights)
                with tf.name_scope('biases'):
                        biases = tf.Variable(tf.zeros([1, out_size]) + 0.1, name='b') # biases should not start at zero, so add 0.1
                        tf.summary.histogram(layer_name + '/biases', biases)
                with tf.name_scope('Wx_plus_b'):
                        Wx_plus_b = tf.add(tf.matmul(inputs, Weights), biases)
                if activation_function is None:
                        outputs = Wx_plus_b
                else:
                        outputs = activation_function(Wx_plus_b)
                tf.summary.histogram(layer_name + '/outputs', outputs)
                return outputs
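Note that everything here is TensorFlow 1.x graph-mode code. If you are on TensorFlow 2.x (an assumption; this post predates it), a minimal sketch of the usual compatibility shim is:

# Only needed on TensorFlow 2.x; the 1.x-style code in this post then runs unchanged.
import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()  # restores graph mode, placeholders, Session, etc.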
  4. loss appears under the SCALARS tab, so record it with tf.summary.scalar():
# define loss function
with tf.name_scope('loss'):
        loss = tf.reduce_mean(tf.square(ys-prediction))
        tf.summary.scalar('loss',loss)
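Any scalar (rank-0) tensor can be tracked the same way. As a hypothetical extra (not in the original post), you could chart the RMSE alongside the loss:

tf.summary.scalar('rmse', tf.sqrt(loss))  # hypothetical second scalar; appears next to loss in SCALARS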
  5. Use tf.summary.merge_all() to merge all summaries into one op. merged only takes effect when it is run, so execute it with sess.run() and record a summary every 50 steps:
sess = tf.Session()
merged = tf.summary.merge_all() # merge all summaries into a single op

writer = tf.summary.FileWriter('logs/',sess.graph)

# initialization
init = tf.global_variables_initializer()
sess.run(init)

for i in range(1000):
        sess.run(train_step,feed_dict={xs:x_data, ys:y_data})  # learn 1000 times
        if i%50==0:
                result = sess.run(merged, feed_dict={xs:x_data, ys:y_data})
                writer.add_summary(result,i)
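One small addition of my own, not in the original post: close the writer once training ends so any buffered events are flushed to disk:

writer.close()  # flush remaining summaries and release the event file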
  6. Open a command prompt, cd to the directory that contains the logs folder, and run: tensorboard --logdir=logs . Then open the link it prints (http://localhost:6006 by default) in a browser (Chrome recommended) to see the results.
[Screenshots: the terminal command; SCALARS (loss curve); GRAPHS; DISTRIBUTIONS (layer1, layer2); HISTOGRAMS]

Full code:

import tensorflow as tf
import numpy as np

def add_layer(inputs, in_size, out_size, n_layer, activation_function=None):
        layer_name = 'layer%s' % n_layer
        with tf.name_scope(layer_name):
                with tf.name_scope('weights'):
                        Weights = tf.Variable(tf.random_normal([in_size, out_size]), name='W') # weight matrix
                        tf.summary.histogram(layer_name + '/weights', Weights)
                with tf.name_scope('biases'):
                        biases = tf.Variable(tf.zeros([1, out_size]) + 0.1, name='b') # biases should not start at zero, so add 0.1
                        tf.summary.histogram(layer_name + '/biases', biases)
                with tf.name_scope('Wx_plus_b'):
                        Wx_plus_b = tf.add(tf.matmul(inputs, Weights), biases)
                if activation_function is None:
                        outputs = Wx_plus_b
                else:
                        outputs = activation_function(Wx_plus_b)
                tf.summary.histogram(layer_name + '/outputs', outputs)
                return outputs

# Generate observed data: y = x^2 - 0.5 plus Gaussian noise
x_data = np.linspace(-1,1,300)[:,np.newaxis]
noise = np.random.normal(0,0.05,x_data.shape)
y_data = np.square(x_data)-0.5+noise

# Feed observed data in through placeholders
with tf.name_scope('inputs'):
        xs = tf.placeholder(tf.float32, [None,1],name='x_input')
        ys = tf.placeholder(tf.float32, [None,1],name='y_input')

# add hidden layer
l1 = add_layer(xs, 1, 10, n_layer=1, activation_function=tf.nn.relu) # The first hidden layer
# add output layer
prediction = add_layer(l1, 10, 1, n_layer=2, activation_function=None) # The output layer

# define loss function
with tf.name_scope('loss'):
        loss = tf.reduce_mean(tf.square(ys-prediction))
        tf.summary.scalar('loss',loss)

# use Gradient Descent Optimizer to minimize loss
with tf.name_scope('train'):
        train_step = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

sess = tf.Session()
merged = tf.summary.merge_all() # merge all summaries into a single op

writer = tf.summary.FileWriter('logs/',sess.graph)

# initialization
init = tf.global_variables_initializer()
sess.run(init)

for i in range(1000):
        sess.run(train_step,feed_dict={xs:x_data, ys:y_data})  # learn 1000 times
        if i%50==0:
                result = sess.run(merged, feed_dict={xs:x_data, ys:y_data})
                writer.add_summary(result,i)
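The post ends here, but a natural extension (my suggestion, not part of the original) is to give each run its own subdirectory, so TensorBoard overlays them as separate curves:

# hypothetical variant: one event directory per run
writer = tf.summary.FileWriter('logs/run1/', sess.graph)
# `tensorboard --logdir=logs` then lists run1, run2, ... as separate runs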
