Building a Fully-Connected Neural Network with TensorFlow


Author: techping | Published 2018-03-11 19:58



    Notes

    • This example uses TensorFlow to build a fully-connected neural network that recognizes MNIST handwritten digits.

    The full code first

    from tensorflow.examples.tutorials.mnist import input_data
    import tensorflow as tf
    
    # prepare data
    mnist = input_data.read_data_sets('MNIST_data', one_hot=True)
    
    xs = tf.placeholder(tf.float32, [None, 784])
    ys = tf.placeholder(tf.float32, [None, 10])
    
    # the model of the fully-connected network
    weights = tf.Variable(tf.random_normal([784, 10]))
    biases = tf.Variable(tf.zeros([1, 10]) + 0.1)
    outputs = tf.matmul(xs, weights) + biases
    predictions = tf.nn.softmax(outputs)
    cross_entropy = tf.reduce_mean(-tf.reduce_sum(ys * tf.log(predictions),
                                                  reduction_indices=[1]))
    train_step = tf.train.GradientDescentOptimizer(0.5).minimize(cross_entropy)
    
    # compute the accuracy
    correct_predictions = tf.equal(tf.argmax(predictions, 1), tf.argmax(ys, 1))
    accuracy = tf.reduce_mean(tf.cast(correct_predictions, tf.float32))
    
    with tf.Session() as sess:
        init = tf.global_variables_initializer()
        sess.run(init)
        for i in range(1000):
            batch_xs, batch_ys = mnist.train.next_batch(100)
            sess.run(train_step, feed_dict={
                xs: batch_xs,
                ys: batch_ys
            })
            if i % 50 == 0:
                print(sess.run(accuracy, feed_dict={
                    xs: mnist.test.images,
                    ys: mnist.test.labels
                }))
    

    Code walkthrough

    1. Load the MNIST data

    mnist = input_data.read_data_sets('MNIST_data', one_hot=True)
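
    To confirm what read_data_sets returns, a quick check of the dataset shapes can help. A minimal sketch, assuming the same TensorFlow 1.x environment and MNIST_data directory as above:

    # Sanity check: mnist.train / mnist.validation / mnist.test each expose
    # flattened `images` and one-hot `labels` arrays.
    print(mnist.train.images.shape)   # (55000, 784) -- 28x28 images, flattened
    print(mnist.train.labels.shape)   # (55000, 10)  -- one-hot labels
    print(mnist.test.images.shape)    # (10000, 784)
    print(mnist.test.labels.shape)    # (10000, 10)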
    

    2. Create the placeholders

    xs = tf.placeholder(tf.float32, [None, 784])
    ys = tf.placeholder(tf.float32, [None, 10])
    
    • xs holds the image pixel data. Each 28×28 image is flattened into a 1×784 vector, and the number of images is left open, so the shape is None×784.
    • ys holds the image labels. The ten digits 0–9 are one-hot encoded, i.e. only the position of the correct digit is 1 and all other positions are 0; a small sketch of both follows below.
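
    As a concrete illustration (plain NumPy, not part of the original example; the digit 3 is just a made-up label):

    import numpy as np
    
    # One-hot encoding of the (hypothetical) label 3.
    one_hot = np.zeros(10, dtype=np.float32)
    one_hot[3] = 1.0
    print(one_hot)                       # [0. 0. 0. 1. 0. 0. 0. 0. 0. 0.]
    
    # A 28x28 image flattened into a 784-vector, matching the [None, 784] placeholder.
    image = np.zeros((28, 28), dtype=np.float32)
    print(image.reshape(1, 784).shape)   # (1, 784)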

    3. Build the model

    weights = tf.Variable(tf.random_normal([784, 10]))
    biases = tf.Variable(tf.zeros([1, 10]) + 0.1)
    outputs = tf.matmul(xs, weights) + biases
    predictions = tf.nn.softmax(outputs)
    cross_entropy = tf.reduce_mean(-tf.reduce_sum(ys * tf.log(predictions),
                                                  reduction_indices=[1]))
    train_step = tf.train.GradientDescentOptimizer(0.5).minimize(cross_entropy)
    

    The softmax function is used as the activation function:

    $$output = Softmax(input \times weight + bias)$$
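
    To see what the softmax and the cross-entropy loss actually compute, here is a small NumPy sketch with made-up logits (the true class is arbitrarily set to 0):

    import numpy as np
    
    # Made-up logits for one image over the 10 digit classes.
    logits = np.array([2.0, 1.0, 0.1, -1.0, 0.5, 0.0, 1.5, -0.5, 0.2, 0.3])
    exp = np.exp(logits - logits.max())   # subtract the max for numerical stability
    probs = exp / exp.sum()               # softmax: probabilities that sum to 1
    print(probs.sum())                    # 1.0
    print(probs.argmax())                 # 0 -- the predicted class
    
    # Cross-entropy against a one-hot label whose true class is 0.
    label = np.zeros(10); label[0] = 1.0
    print(-(label * np.log(probs)).sum())

    Note that taking tf.log of a softmax output can hit log(0); in TensorFlow 1.x the combined op tf.nn.softmax_cross_entropy_with_logits is the numerically safer choice, but the explicit form above mirrors the classic tutorial.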

    4. Compute the accuracy

    correct_predictions = tf.equal(tf.argmax(predictions, 1), tf.argmax(ys, 1))
    accuracy = tf.reduce_mean(tf.cast(correct_predictions, tf.float32))
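
    The same argmax/equal/cast chain in plain NumPy, with made-up predictions for three images (three classes are enough for illustration):

    import numpy as np
    
    # Hypothetical softmax outputs (rows sum to 1) and one-hot labels.
    predictions = np.array([[0.1, 0.8, 0.1],
                            [0.3, 0.3, 0.4],
                            [0.7, 0.2, 0.1]])
    labels = np.array([[0., 1., 0.],
                       [0., 0., 1.],
                       [0., 1., 0.]])
    correct = np.argmax(predictions, axis=1) == np.argmax(labels, axis=1)
    print(correct)                            # [ True  True False]
    print(correct.astype(np.float32).mean())  # 0.6666667 -- the accuracy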
    

    5. Train and evaluate the model

    with tf.Session() as sess:
        init = tf.global_variables_initializer()
        sess.run(init)
        for i in range(1000):
            batch_xs, batch_ys = mnist.train.next_batch(100)
            sess.run(train_step, feed_dict={
                xs: batch_xs,
                ys: batch_ys
            })
            if i % 50 == 0:
                print(sess.run(accuracy, feed_dict={
                    xs: mnist.test.images,
                    ys: mnist.test.labels
                }))
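
    The session discards the learned variables when it closes; if you want to keep them, a tf.train.Saver can be added. A minimal sketch, where the checkpoint path is just an example:

    saver = tf.train.Saver()   # create before opening the session
    
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for i in range(1000):
            batch_xs, batch_ys = mnist.train.next_batch(100)
            sess.run(train_step, feed_dict={xs: batch_xs, ys: batch_ys})
        # Write the trained weights and biases to a checkpoint file.
        save_path = saver.save(sess, './mnist_fc.ckpt')   # hypothetical path
        print('Model saved to', save_path)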
    

    Results

    After 1000 training iterations, the accuracy is around 87%.

    Extracting MNIST_data/train-images-idx3-ubyte.gz
    Extracting MNIST_data/train-labels-idx1-ubyte.gz
    Extracting MNIST_data/t10k-images-idx3-ubyte.gz
    Extracting MNIST_data/t10k-labels-idx1-ubyte.gz
    0.1041
    0.632
    0.7357
    0.7837
    0.7971
    0.8147
    0.8283
    0.8376
    0.8423
    0.8501
    0.8501
    0.8533
    0.8567
    0.8597
    0.8552
    0.8647
    0.8654
    0.8701
    0.8712
    0.8712
    
