The layers module is a high-level module in TensorFlow that provides convenient APIs for building dense layers, convolutional layers, activation layers, dropout regularization and so on. This article uses the layers module to quickly build a deep convolutional neural network for MNIST classification, adapted from the classic LeNet-style architecture.
Main program skeleton
First write a program skeleton and then fill in the code (tf.app.run() parses the command-line flags and calls the main function that we will define later):
# Imports
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
# TensorFlow and numpy
import tensorflow as tf
import numpy as np
# Set the logging verbosity so that INFO-level (and higher) messages are printed
tf.logging.set_verbosity(tf.logging.INFO)
# The application code will be added below
if __name__ == "__main__":
    tf.app.run()
Convolutional neural networks
A typical CNN contains a stack of convolutional modules that extract features; each module consists of a convolutional layer followed by a pooling layer. After the last convolutional module come one or more dense (fully connected) layers.
In the simple MNIST classification task, the final dense layer has one node per class. A softmax activation turns the logits into numbers between 0 and 1 that express how strongly the current image matches each class, and the class with the largest value is taken as the prediction.
The classic network architecture for MNIST classification is:
Layer 1: convolutional layer with 32 filters, each 5×5, with ReLU activation
Layer 2: max-pooling layer with a 2×2 pool and stride 2 (the pooled regions do not overlap)
Layer 3: convolutional layer with 64 filters, each 5×5, with ReLU activation
Layer 4: max-pooling layer with a 2×2 pool and stride 2
Layer 5: dense (fully connected) layer with 1,024 neurons and dropout regularization with rate 0.4 (during training each unit is dropped with probability 0.4)
Layer 6: dense layer with 10 neurons, one for each digit class (the logits layer)
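As a quick sanity check on this architecture, here is a rough shape trace in plain Python (the batch dimension is omitted; with "same" padding each convolution keeps the spatial size, and each 2×2 pooling with stride 2 halves it):
h = w = 28              # input: 28 x 28 x 1
h, w = h // 2, w // 2   # conv1 keeps 28 x 28 (32 channels); pool1 -> 14 x 14 x 32
h, w = h // 2, w // 2   # conv2 keeps 14 x 14 (64 channels); pool2 -> 7 x 7 x 64
flat = h * w * 64       # 7 * 7 * 64 = 3136 features feed the 1024-unit dense layer
print(h, w, flat)       # 7 7 3136
This is where the 7 * 7 * 64 in the reshape further down comes from.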
Basic functions of TensorFlow's layers module
The layers module in TensorFlow provides the following functions for creating the layers described above:
conv2d: creates a 2-D convolutional layer; its arguments include the number of filters, the kernel size, the padding mode, and the activation function.
max_pooling2d: creates a 2-D max-pooling layer; its arguments include the pool size and the stride.
dense: creates a dense (fully connected) layer; its arguments include the number of neurons and the activation function.
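As a minimal, self-contained illustration of these three calls (a toy graph built on a placeholder; the names x, conv, pool and out are only used in this sketch):
import tensorflow as tf

x = tf.placeholder(tf.float32, [None, 28, 28, 1])
conv = tf.layers.conv2d(inputs=x, filters=32, kernel_size=[5, 5],
                        padding="same", activation=tf.nn.relu)
pool = tf.layers.max_pooling2d(inputs=conv, pool_size=[2, 2], strides=2)
out = tf.layers.dense(inputs=tf.layers.flatten(pool), units=10)
print(out.shape)  # (?, 10)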
Adding the model function
Add a function to the skeleton above that builds the network model:
def cnn_model_fn(features, labels, mode):
    """Model function for the CNN."""
    # Input layer
    input_layer = tf.reshape(features['x'], [-1, 28, 28, 1])
    # First convolutional layer
    conv1 = tf.layers.conv2d(
        inputs=input_layer,
        filters=32,
        kernel_size=[5, 5],
        padding="same",
        activation=tf.nn.relu)
    # First pooling layer
    pool1 = tf.layers.max_pooling2d(inputs=conv1, pool_size=[2, 2], strides=2)
    # Second convolutional layer
    conv2 = tf.layers.conv2d(
        inputs=pool1,
        filters=64,
        kernel_size=[5, 5],
        padding="same",
        activation=tf.nn.relu)
    # Second pooling layer
    pool2 = tf.layers.max_pooling2d(inputs=conv2, pool_size=[2, 2], strides=2)
    # Dense (fully connected) layer
    pool2_flat = tf.reshape(pool2, [-1, 7 * 7 * 64])
    dense = tf.layers.dense(inputs=pool2_flat, units=1024, activation=tf.nn.relu)
    dropout = tf.layers.dropout(
        inputs=dense, rate=0.4, training=mode == tf.estimator.ModeKeys.TRAIN)
    # Logits layer (final dense layer)
    logits = tf.layers.dense(inputs=dropout, units=10)
    predictions = {
        # Generate predictions (for PREDICT and EVAL mode)
        "classes": tf.argmax(input=logits, axis=1),
        # Add `softmax_tensor` to the graph. It is used for PREDICT and by the
        # `logging_hook`.
        "probabilities": tf.nn.softmax(logits, name="softmax_tensor")
    }
    if mode == tf.estimator.ModeKeys.PREDICT:
        return tf.estimator.EstimatorSpec(mode=mode, predictions=predictions)
    # Calculate Loss (for both TRAIN and EVAL modes)
    loss = tf.losses.sparse_softmax_cross_entropy(labels=labels, logits=logits)
    # Configure the Training Op (for TRAIN mode)
    if mode == tf.estimator.ModeKeys.TRAIN:
        optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.001)
        train_op = optimizer.minimize(
            loss=loss,
            global_step=tf.train.get_global_step())
        return tf.estimator.EstimatorSpec(mode=mode, loss=loss, train_op=train_op)
    # Add evaluation metrics (for EVAL mode)
    eval_metric_ops = {
        "accuracy": tf.metrics.accuracy(
            labels=labels, predictions=predictions["classes"])}
    return tf.estimator.EstimatorSpec(
        mode=mode, loss=loss, eval_metric_ops=eval_metric_ops)
Explanation of the code above
- Input layer: reshape the input data into a tensor of shape
  [batch_size, image_width, image_height, channels]
  Here batch_size is set to -1, which means it is computed dynamically during training; this lets us treat the batch size as a tunable hyperparameter.
- Convolutional layer: its inputs argument must be a 4-D tensor of the shape described for the input layer. The kernel size is specified with two numbers; when the kernel height and width are equal, a single number is enough. padding accepts two values: "same" means the output has the same spatial size as the input (TensorFlow pads the edges with zeros), while "valid", the default, applies no padding.
- Activation: the convolution output is activated with ReLU.
- Pooling layer: takes only two parameters, the pool size and the stride.
- Dense layer: one or more dense (fully connected) layers are usually added after the convolutional layers. The convolutional output is a multi-dimensional tensor while a dense layer expects a flat feature vector, so the pooling output has to be flattened before it is fed into the dense layer.
- Dropout: to improve the model, dropout regularization is applied, and it is also expressed as a layer. The argument training=mode == tf.estimator.ModeKeys.TRAIN means dropout is only active during training.
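A side note on the flattening step: instead of a manual tf.reshape with a hard-coded 7 * 7 * 64, tf.layers.flatten can work the flat size out from the tensor shape. This is an equivalent alternative, not what the code above uses:
pool2_flat = tf.layers.flatten(pool2)  # same result as tf.reshape(pool2, [-1, 7 * 7 * 64])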
Prediction
The last layer is named logits, and the network outputs an N×10 tensor, where N is the batch size. What we actually want is a predicted class and its probability, so we build the following dictionary:
predictions = {
    # Generate predictions (for PREDICT and EVAL mode)
    "classes": tf.argmax(input=logits, axis=1),
    # Add `softmax_tensor` to the graph. It is used for PREDICT and by the
    # `logging_hook`.
    "probabilities": tf.nn.softmax(logits, name="softmax_tensor")
}
This is a Python dictionary. The first entry's key is "classes"; its value is the index of the largest value in each row, and that index is the predicted class. The second entry is "probabilities", whose value is the row-wise softmax of the logits. The softmax op is explicitly named "softmax_tensor" so it can be referenced later for logging.
This dictionary is used at prediction time: in PREDICT mode it is passed to EstimatorSpec and returned.
if mode == tf.estimator.ModeKeys.PREDICT:
    return tf.estimator.EstimatorSpec(mode=mode, predictions=predictions)
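For completeness, this is roughly how PREDICT mode would be exercised once training is done. It is only a sketch: new_images stands for a hypothetical numpy array of flattened images, and mnist_classifier is the estimator created later in this article:
pred_input_fn = tf.estimator.inputs.numpy_input_fn(
    x={"x": new_images}, num_epochs=1, shuffle=False)
for pred in mnist_classifier.predict(input_fn=pred_input_fn):
    print(pred["classes"], pred["probabilities"].max())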
Training
To define the training op, we first need a loss function; here we use the cross-entropy loss.
loss = tf.losses.sparse_softmax_cross_entropy(labels=labels, logits=logits)
The call above uses the integer labels directly, whereas the official tutorial uses onehot_labels, expanding each original label (a single digit) into a zero-padded one-hot vector:
onehot_labels = tf.one_hot(indices=tf.cast(labels, tf.int32), depth=10)
loss = tf.losses.softmax_cross_entropy(
    onehot_labels=onehot_labels, logits=logits)
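Both formulations compute the same loss for integer class labels; tf.one_hot simply expands each label into a length-10 indicator vector. A tiny sketch (the expected values are shown in the comments):
labels_example = tf.constant([3, 0], dtype=tf.int32)
onehot_example = tf.one_hot(indices=labels_example, depth=10)
# row 0 -> [0, 0, 0, 1, 0, 0, 0, 0, 0, 0]
# row 1 -> [1, 0, 0, 0, 0, 0, 0, 0, 0, 0]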
The training op itself is:
if mode == tf.estimator.ModeKeys.TRAIN:
    optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.001)
    train_op = optimizer.minimize(
        loss=loss,
        global_step=tf.train.get_global_step())
    return tf.estimator.EstimatorSpec(mode=mode, loss=loss, train_op=train_op)
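If plain gradient descent with this learning rate converges too slowly, the optimizer is the only line that needs to change. For example, Adam could be used instead (an alternative, not what the tutorial uses; the learning rate shown is just a common starting point):
optimizer = tf.train.AdamOptimizer(learning_rate=1e-4)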
Evaluation
During evaluation we care about whether the predicted class matches the true class:
eval_metric_ops = {
    "accuracy": tf.metrics.accuracy(
        labels=labels, predictions=predictions["classes"])}
return tf.estimator.EstimatorSpec(
    mode=mode, loss=loss, eval_metric_ops=eval_metric_ops)
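eval_metric_ops is an ordinary dictionary, so further metrics can be reported in the same way. A possible extension (not part of the original code) that also tracks per-class accuracy:
eval_metric_ops = {
    "accuracy": tf.metrics.accuracy(
        labels=labels, predictions=predictions["classes"]),
    "mean_per_class_accuracy": tf.metrics.mean_per_class_accuracy(
        labels=labels, predictions=predictions["classes"], num_classes=10)}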
In the training, evaluation, and prediction branches we always hand the mode, together with the loss, training op, predictions, or metric dictionary, to EstimatorSpec and return the resulting spec.
Training the network
First define the main function and load the data:
def main(unused_argv):
    # Load training and eval data
    mnist = tf.contrib.learn.datasets.load_dataset("mnist")
    train_data = mnist.train.images  # Returns np.array
    train_labels = np.asarray(mnist.train.labels, dtype=np.int32)
    eval_data = mnist.test.images  # Returns np.array
    eval_labels = np.asarray(mnist.test.labels, dtype=np.int32)
This uses the dataset loader that ships with the TensorFlow tutorials to download MNIST automatically and convert it into numpy np.array format; 55,000 images are used for training and 10,000 for evaluation.
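A quick way to confirm what was loaded (the shapes below follow from the standard MNIST split):
print(train_data.shape)    # (55000, 784) -- flattened 28x28 grayscale images
print(train_labels.shape)  # (55000,)
print(eval_data.shape)     # (10000, 784)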
Create an estimator
mnist_classifier = tf.estimator.Estimator(model_fn=cnn_model_fn, model_dir="/tmp/mnist_convnet_model")
The model function cnn_model_fn covers all three return paths: training, evaluation, and prediction.
Set up a logging hook
tensors_to_log = {"probabilities": "softmax_tensor"}
logging_hook = tf.train.LoggingTensorHook(
tensors=tensors_to_log, every_n_iter=50)
The tensors to log are specified as a dictionary: the key probabilities is the label that appears in the log output, and its value, softmax_tensor, is the name of the tensor to log, which we explicitly assigned in cnn_model_fn. The hook prints these tensors every 50 training steps.
Train the model
Define the training input function and call the estimator's train method:
train_input_fn = tf.estimator.inputs.numpy_input_fn(
    x={"x": train_data},
    y=train_labels,
    batch_size=100,
    num_epochs=None,
    shuffle=True)
mnist_classifier.train(
    input_fn=train_input_fn,
    steps=20000,
    hooks=[logging_hook])
The batch size is 100, i.e., each training step uses 100 samples.
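With batch_size=100 and steps=20000, training consumes 100 × 20,000 = 2,000,000 examples, i.e. roughly 2,000,000 / 55,000 ≈ 36 passes over the training set; num_epochs=None lets the input function cycle through the data for as long as training keeps asking for batches.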
Evaluate the model
Define the evaluation input function (num_epochs=1 so the evaluation set is passed through exactly once, shuffle=False so the examples stay in order) and call the estimator's evaluate method:
# Evaluate the model and print results
eval_input_fn = tf.estimator.inputs.numpy_input_fn(
    x={"x": eval_data},
    y=eval_labels,
    num_epochs=1,
    shuffle=False)
eval_results = mnist_classifier.evaluate(input_fn=eval_input_fn)
print(eval_results)
Running it
The output looks like this:
INFO:tensorflow:step = 19901, loss = 0.07808095 (14.584 sec)
INFO:tensorflow:probabilities = [[0.00653758 0.00027977 0.02949601 0.00125711 0.3366562 0.00445841
0.14701328 0.00127716 0.21882716 0.25419727]
[0.00430843 0.00016417 0.00113066 0.02829063 0.00024055 0.93776697
0.00143744 0.00001802 0.02562505 0.00101799]
[0.9999541 0.00000003 0.00003625 0.00000008 0. 0.00000439
0.00000004 0.00000358 0.00000051 0.00000109]
[0.00008427 0.00000212 0.00008484 0.00005185 0.925472 0.00016064
0.00003379 0.00013239 0.01440229 0.05957583]
[0.99740785 0.0000001 0.00172199 0.00000233 0.0000049 0.00000163
0.00027313 0.00000015 0.00050908 0.00007891]
[0.00029092 0.00001455 0.0005991 0.00000168 0.00001352 0.07832033
0.8784256 0.00000001 0.04233294 0.00000141]
[0.00000061 0.00000002 0.00000446 0.9998882 0. 0.0000235
0. 0.00000956 0.00000513 0.00006869]
[0.0000003 0.99950767 0.00001671 0.00000373 0.00001526 0.00000009
0.00000912 0.00018723 0.00025866 0.00000117]
[0.00000063 0.00000266 0.00005948 0.00047997 0.00000392 0.00013671
0.00001396 0.00001721 0.9990503 0.00023515]
[0.0000724 0.99552804 0.00201107 0.00055174 0.00013566 0.00020998
0.00053835 0.00028406 0.00043207 0.00023667]
[0.00141204 0.05343223 0.00628457 0.02424757 0.00002866 0.00039093
0.00001509 0.00093949 0.911214 0.00203547]
[0.00000168 0.0000008 0.00016274 0.00021079 0.00000015 0.00000158
0. 0.9991935 0.00001052 0.00041815]
[0.00002727 0.00627399 0.00002645 0.00746263 0.03002508 0.00042525
0.00000047 0.03689083 0.00298324 0.9158847 ]
[0.20883888 0.00000094 0.78590924 0.0000095 0.00080836 0.00000171
0.00158269 0.00002897 0.00122197 0.00159774]
[0.00000309 0.00000142 0.00065031 0.000364 0.00028749 0.00016724
0.00001817 0.00000014 0.99835086 0.00015735]
[0.00024412 0.01437429 0.97447646 0.00627168 0.00004161 0.00006755
0.00013385 0.00118695 0.00314311 0.00006047]
[0.00000118 0.0000001 0.00002905 0.0000519 0.00000023 0.00000202
0. 0.99958724 0.00000062 0.00032768]
[0.00001657 0.00000003 0.00002998 0.0000301 0.00019094 0.00002377
0.00000058 0.00450326 0.00005672 0.995148 ]
[0.00001416 0.9985292 0.00013869 0.00023481 0.00005727 0.00001574
0.00002333 0.00033733 0.00049305 0.00015641]
[0.00000115 0.00044185 0.00051543 0.00000114 0.00000975 0.00008357
0.99894387 0. 0.00000323 0. ]
[0.00000022 0.00000022 0.00066469 0.99725014 0.00000083 0.00002809
0.00000001 0.00000016 0.00205229 0.00000325]
[0.00000116 0.0000012 0.00002754 0.9997433 0.00000011 0.00009156
0.00000005 0.00000006 0.00013239 0.00000267]
[0.00000151 0.99815434 0.00019713 0.00012246 0.00033924 0.00000488
0.00002861 0.00073196 0.00027945 0.0001404 ]
[0.00017664 0.00000062 0.00000214 0.00990055 0.00000157 0.9764072
0.00000094 0.00003025 0.00076562 0.01271451]
[0.0001976 0.11958738 0.05967849 0.49799427 0.00000878 0.00022552
0.00000506 0.00147579 0.320353 0.000474 ]
[0.00000666 0.00000002 0.00000091 0.00205978 0.00052456 0.00006016
0.00000007 0.00257819 0.00003914 0.9947305 ]
[0.00008026 0.00014217 0.00000212 0.00014914 0.00057236 0.02275991
0.00000071 0.00018375 0.9689307 0.00717883]
[0.9926137 0.00000022 0.0064486 0.00000226 0.00007781 0.00000266
0.00002063 0.00009304 0.00028803 0.0004531 ]
[0.00000459 0.9985 0.00025249 0.00005222 0.00005671 0.00000391
0.00000631 0.00094015 0.00015452 0.00002908]
[0.91281885 0.00000056 0.00216166 0.00009756 0.00132021 0.0002136
0.06025714 0.00000016 0.01627978 0.00685052]
[0.00001281 0.9991773 0.00011872 0.00014762 0.00002076 0.00000587
0.00002503 0.00028516 0.00017192 0.00003479]
[0.00032988 0.15245304 0.19930777 0.09470434 0.00036543 0.06287006
0.48210916 0.00004639 0.00779235 0.00002151]
[0.0005195 0.218501 0.00022941 0.00081896 0.7122533 0.00010951
0.00077851 0.00327465 0.00367129 0.05984387]
[0.00017201 0.00552427 0.00042782 0.00018167 0.00001642 0.00004083
0.00000794 0.00004608 0.99347866 0.00010436]
[0.00013177 0.00000003 0.00000018 0.00000442 0.00001539 0.00000248
0.00000001 0.99761164 0.00000007 0.00223403]
[0.0000018 0.00000018 0.00000045 0.0000072 0.0001452 0.0000165
0. 0.9389009 0.00000439 0.06092336]
[0.00000011 0.6364836 0.3591752 0.00049449 0.00000629 0.0000198
0.00346324 0.00000053 0.00035643 0.00000045]
[0.00824901 0.00050231 0.00187641 0.00050592 0.00061081 0.9755182
0.01210836 0.00029249 0.0001497 0.00018675]
[0.00172608 0.06643065 0.0023376 0.71670836 0.00078903 0.20590669
0.00328048 0.00010317 0.00037964 0.00233831]
[0.00000123 0.00000009 0.00003369 0.00047659 0.00001952 0.00000508
0.00000008 0.00000025 0.9991571 0.00030642]
[0.00013784 0.0000158 0.00004419 0.0000162 0.00000404 0.00045562
0.9979761 0. 0.00134874 0.00000141]
[0.0000003 0.00000001 0.00000123 0.00000307 0.00000005 0.00000003
0. 0.999908 0.00000003 0.00008717]
[0.00000205 0.00000022 0.99997926 0.00001272 0.00000062 0.00000007
0.00000457 0.00000002 0.00000044 0.00000003]
[0.00001608 0.00000087 0.00003536 0.00010281 0.00000395 0.000016
0. 0.9878291 0.00000007 0.01199591]
[0.00012509 0.00001078 0.00009395 0.00012845 0.00051081 0.00010073
0.00000005 0.9817062 0.00006088 0.0172631 ]
[0.00000909 0.0000201 0.00044984 0.00000101 0.00056437 0.00000609
0.9989268 0.00000038 0.00002199 0.0000003 ]
[0.00000009 0.00000007 0.00000048 0.00000023 0.9996197 0.00000008
0.00000227 0.00000483 0.00000627 0.00036583]
[0.00012676 0.00001876 0.00007024 0.00068287 0.00005605 0.00006106
0.00000003 0.9679878 0.00003399 0.03096248]
[0.9993549 0.00000074 0.00003765 0.00002213 0.00000079 0.00055099
0.0000094 0.00001461 0.00000603 0.00000275]
[0.00002917 0.00178283 0.00017922 0.00015926 0.00151695 0.0001843
0.99439776 0.00000086 0.00174589 0.00000394]
[0.00002808 0.00000463 0.9954602 0.0042828 0.0000202 0.00000494
0.00000016 0.00004279 0.00007798 0.00007808]
[0.00181483 0.00091174 0.22887605 0.05120551 0.20310538 0.00272528
0.04843728 0.00687791 0.45050454 0.00554154]
[0.00001027 0.00001011 0.0031928 0.00231415 0.00002076 0.00014704
0.00000187 0.00000758 0.9942179 0.00007751]
[0.00000256 0.9992223 0.00001775 0.00000925 0.00000847 0.00003013
0.00037166 0.00001386 0.00032201 0.00000193]
[0.0056904 0.000032 0.00061433 0.01692274 0.0001197 0.95999926
0.00006952 0.00159048 0.00249305 0.01246854]
[0.3027773 0.00003069 0.6492896 0.03497973 0.00000329 0.00641184
0.0000036 0.00113799 0.00454407 0.00082192]
[0.00001019 0.00001161 0.00003709 0.9994134 0.00000238 0.0002097
0.00000002 0.00000332 0.00007973 0.00023264]
[0.00000042 0.00000123 0.47450382 0.4914326 0.00000012 0.00003374
0.00000005 0.00000639 0.03402162 0.00000009]
[0.00001646 0.00000708 0.00794297 0.0002719 0.00000103 0.00021443
0.00003541 0.00000037 0.9914929 0.00001751]
[0.00000099 0.00000104 0.00001478 0.00026425 0.00024689 0.00005933
0.00000002 0.00773065 0.00005008 0.9916319 ]
[0.09405121 0.09254706 0.19568071 0.07942136 0.09395033 0.07075018
0.04593598 0.00462806 0.23319057 0.08984453]
[0.9999958 0. 0.00000403 0. 0. 0.00000013
0.00000004 0.00000001 0.00000003 0.00000002]
[0.00006213 0.00262884 0.00044107 0.00010354 0.00025959 0.00003446
0.00002813 0.00005732 0.99602365 0.00036121]
[0.00008898 0.000093 0.00060717 0.0006796 0.00005402 0.0002413
0.00091443 0.00000404 0.9966253 0.00069223]
[0.00063553 0.00001075 0.00359965 0.0014327 0.00001251 0.00001405
0.00000027 0.00125775 0.99206334 0.00097348]
[0.00001974 0.00173059 0.00089575 0.00093818 0.9341853 0.00004982
0.00017829 0.00013888 0.00043839 0.06142506]
[0.00000165 0.99886537 0.00014239 0.00003912 0.00000465 0.00000252
0.00001863 0.00003426 0.00088555 0.00000591]
[0.99814713 0.00000002 0.00172245 0.00000143 0.00001572 0.00000061
0.00004197 0.00000182 0.00002012 0.00004875]
[0.00000202 0.00000109 0.00000461 0.0000435 0.00740274 0.00001684
0.00000008 0.00637304 0.00003441 0.9861216 ]
[0.00000268 0.0000003 0.00006067 0.0004232 0.00001554 0.00007788
0.00000004 0.99245775 0.00001858 0.00694332]
[0.0000017 0.00000011 0.00000299 0.00000548 0.16725917 0.00009919
0.00001178 0.00001511 0.8325185 0.00008594]
[0.00000984 0.9700659 0.00087443 0.00195125 0.00005733 0.00001509
0.00000763 0.02317022 0.00086504 0.00298324]
[0.00007618 0.00001846 0.96646905 0.0306498 0.00099017 0.00006282
0.00021982 0.00005371 0.0013888 0.00007127]
[0.00000676 0.9981876 0.00003285 0.00022257 0.00001139 0.00000334
0.0000054 0.00095555 0.00047863 0.00009585]
[0.00000043 0.00064039 0.00003074 0.00414179 0.19729371 0.00236706
0.00000192 0.00275983 0.00400306 0.7887611 ]
[0.00014772 0.980079 0.00166267 0.00085004 0.00426644 0.00020811
0.0009556 0.00986576 0.00154716 0.00041758]
[0.00002886 0.00003736 0.00139656 0.00058228 0.00002099 0.00008117
0.00000027 0.9947202 0.00016391 0.00296841]
[0.00049845 0.0032278 0.95299697 0.01048457 0.007143 0.00316678
0.00457233 0.00098773 0.0144492 0.00247315]
[0.00000141 0.99871635 0.00002957 0.00017025 0.00004952 0.00000244
0.00000615 0.00015065 0.0006961 0.00017761]
[0.00000312 0.997297 0.00009875 0.00007831 0.00049703 0.00002494
0.00031899 0.00006776 0.00148947 0.00012467]
[0.00174914 0.00017846 0.01393098 0.96345997 0.0000006 0.01945871
0.00000837 0.00000631 0.00092383 0.00028364]
[0.00015642 0.9937558 0.00058608 0.00031955 0.00014954 0.00005626
0.00298496 0.00144757 0.00048343 0.00006035]
[0.00000504 0.00000102 0.00027206 0.00020278 0.00015762 0.00000692
0.00000078 0.9987846 0.00000232 0.00056676]
[0.00008971 0.00000002 0.00000207 0.00000101 0.00000019 0.00000922
0.00000034 0.00000003 0.9998965 0.00000083]
[0.00000004 0. 0.9999863 0.00000318 0.00000004 0.
0.00000001 0.00000005 0.00001037 0.00000001]
[0.99964476 0.00000055 0.00005032 0.00000129 0.00000017 0.00017659
0.00012291 0.00000033 0.00000053 0.0000026 ]
[0.00000022 0.00000005 0.00000246 0.00023245 0.00096876 0.00000079
0.00000002 0.00178105 0.00008815 0.996926 ]
[0.00001709 0.99782276 0.00029405 0.00006721 0.00027816 0.00000463
0.00007061 0.00042154 0.00099421 0.00002973]
[0.00339555 0.00000942 0.00020897 0.00210331 0.00000237 0.9927417
0.00001523 0.00018817 0.00105983 0.00027542]
[0.00000594 0.00023875 0.00528529 0.01841949 0.00895302 0.00003197
0.00000087 0.9497999 0.00011747 0.01714729]
[0.9999428 0. 0.00001308 0.00000029 0.00000073 0.00000021
0.00000003 0.00000855 0.00000049 0.00003381]
[0.00000025 0.00001713 0.02760528 0.97005975 0.00000037 0.00000075
0. 0.00119437 0.00111912 0.00000309]
[0.00000046 0.00000537 0.00001953 0.00012075 0.00127234 0.00001287
0.0000001 0.00579237 0.00004464 0.99273163]
[0.00102573 0.0042882 0.00200124 0.00334653 0.00034228 0.00207408
0.00011397 0.00123924 0.9808749 0.00469377]
[0.00547595 0.05328 0.03533481 0.00643996 0.0007083 0.01062255
0.00072624 0.00169955 0.8759648 0.00974783]
[0.0000519 0.00015023 0.0003908 0.00003585 0.0024822 0.00322469
0.9918715 0.00000768 0.00168992 0.00009532]
[0.00001466 0.00004805 0.00004427 0.00038582 0.04005297 0.00087443
0.00000483 0.14090365 0.02829087 0.78938043]
[0.00000003 0.00000004 0.00003817 0.00006114 0.00092822 0.00000595
0.00000008 0.00052061 0.00006017 0.99838555]
[0.00000271 0.00096103 0.56573725 0.42055273 0.0000004 0.00001489
0.00000021 0.00020772 0.01226761 0.00025546]
[0.00000023 0.00000005 0.00000264 0.00000033 0.00000004 0.00000629
0.00000001 0.00000024 0.9999901 0.00000012]] (7.311 sec)
INFO:tensorflow:Saving checkpoints for 20000 into /tmp/mnist_convnet_model/model.ckpt.
INFO:tensorflow:Loss for final step: 0.23901062.
INFO:tensorflow:Starting evaluation at 2018-04-04-09:54:09
2018-04-04 17:54:09.532357: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1120] Creating TensorFlow device (/device:GPU:0) -> (device: 0, name: GeForce GT 710, pci bus id: 0000:01:00.0, compute capability: 3.5)
INFO:tensorflow:Restoring parameters from /tmp/mnist_convnet_model/model.ckpt-20000
INFO:tensorflow:Finished evaluation at 2018-04-04-09:54:14
INFO:tensorflow:Saving dict for global step 20000: accuracy = 0.9697, global_step = 20000, loss = 0.09833111
{'global_step': 20000, 'loss': 0.09833111, 'accuracy': 0.9697}
Process finished with exit code 0