TensorFlow Nonlinear Regression Examples

Author: 上行彩虹人 | Published 2018-09-16 16:11

1. Example 1

import tensorflow as tf
import numpy as np
import matplotlib.pyplot as plt

# training data: 200 points of y = x^2 plus Gaussian noise, x in [-0.5, 0.5]
x_data = np.linspace(-0.5, 0.5, 200)[:, np.newaxis]
y_data = np.square(x_data) + np.random.normal(0, 0.02, x_data.shape)

# placeholders for inputs and targets, each of shape [batch, 1]
x = tf.placeholder(tf.float32, [None, 1])
y = tf.placeholder(tf.float32, [None, 1])

# hidden layer: 1 -> 10 units, tanh activation
weight_l1 = tf.Variable(tf.random_normal([1, 10]))
bias_l1 = tf.Variable(tf.zeros([1, 10]))
y_l1 = tf.matmul(x, weight_l1) + bias_l1
l1 = tf.nn.tanh(y_l1)

# output layer: 10 -> 1 unit, tanh activation
weight_l2 = tf.Variable(tf.random_normal([10, 1]))
bias_l2 = tf.Variable(tf.zeros([1, 1]))
y_l2 = tf.matmul(l1, weight_l2) + bias_l2
prediction = tf.nn.tanh(y_l2)
# prediction = tf.nn.relu(y_l2)

# mean squared error loss, minimized with gradient descent (learning rate 0.1)
loss = tf.reduce_mean(tf.square(y - prediction))
train_step = tf.train.GradientDescentOptimizer(0.1).minimize(loss)
init = tf.global_variables_initializer()


with tf.Session() as sess:
    sess.run(init)
    # train on the full dataset for 10000 steps
    for _ in range(10000):
        sess.run(train_step, feed_dict={x: x_data, y: y_data})
        # print(sess.run(loss, feed_dict={x: x_data, y: y_data}))

    # plot the fitted curve over the noisy training data
    pre = sess.run(prediction, feed_dict={x: x_data})
    plt.figure()
    plt.scatter(x_data, y_data)
    plt.plot(x_data, pre, 'r-', lw=5)
    plt.show()

Result:

(Figure: scatter of the noisy training data with the fitted curve in red)
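
As an aside (not in the original post), the same 1 -> 10 -> 1 tanh network can be written more compactly with TensorFlow 1.x's tf.layers API. This is only a minimal sketch, assuming the same placeholders, data, and training loop as above; note that tf.layers.dense uses its own default weight initializer rather than tf.random_normal:

# sketch only: compact tf.layers version of the Example 1 model
hidden = tf.layers.dense(x, 10, activation=tf.nn.tanh)          # hidden layer, 1 -> 10
prediction = tf.layers.dense(hidden, 1, activation=tf.nn.tanh)  # output layer, 10 -> 1
loss = tf.reduce_mean(tf.square(y - prediction))
train_step = tf.train.GradientDescentOptimizer(0.1).minimize(loss)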

2. Example 2

When x_data is extended to [-1, 1], the fit above is no longer satisfactory.
Consider adding one more hidden layer, switching the activations near the input to ReLU,
and also increasing the number of x_data samples.

# same imports as in Example 1
import tensorflow as tf
import numpy as np
import matplotlib.pyplot as plt

# more training data over the wider range [-1, 1]
x_data = np.linspace(-1, 1, 2000)[:, np.newaxis]

noise = np.random.normal(0, 0.02, x_data.shape)
y_data = np.square(x_data) + noise

x = tf.placeholder(tf.float32, [None, 1])
y = tf.placeholder(tf.float32, [None, 1])

# first hidden layer: 1 -> 10 units, ReLU activation
weight_1 = tf.Variable(tf.random_normal([1, 10]))
bias_1 = tf.Variable(tf.zeros([1, 10]))
layer_1 = tf.matmul(x, weight_1) + bias_1
layer_1_out = tf.nn.relu(layer_1)

# second hidden layer: 10 -> 5 units, ReLU activation
weight_2 = tf.Variable(tf.random_normal([10, 5]))
bias_2 = tf.Variable(tf.zeros([1, 5]))
layer_2 = tf.matmul(layer_1_out, weight_2) + bias_2
layer_2_out = tf.nn.relu(layer_2)

# output layer: 5 -> 1 unit, tanh activation
weight_3 = tf.Variable(tf.random_normal([5, 1]))
bias_3 = tf.Variable(tf.zeros([1, 1]))
layer_3 = tf.matmul(layer_2_out, weight_3) + bias_3
prediction = tf.nn.tanh(layer_3)

# mean squared error loss, trained with plain gradient descent (learning rate 1)
loss = tf.reduce_mean(tf.square(prediction - y))
optimizer = tf.train.GradientDescentOptimizer(1)
train_step = optimizer.minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # train for 20000 steps, logging the loss every 100 steps
    for step in range(20000):
        sess.run(train_step, feed_dict={x: x_data, y: y_data})
        if step % 100 == 0:
            print(sess.run(loss, feed_dict={x: x_data, y: y_data}))

    # plot the fitted curve over the noisy training data
    prediction_value = sess.run(prediction, feed_dict={x: x_data})
    plt.figure()
    plt.scatter(x_data, y_data)
    plt.plot(x_data, prediction_value, color='red')
    plt.show()

(Figure: scatter of the training data with the fitted curve in red)
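
To get predictions for inputs that were not in the training set, feed them through the same prediction tensor while the session is still open. A minimal sketch; the x_test values are only illustrative, and the lines are assumed to sit inside the with tf.Session() block above, after training has finished:

# sketch only: query the trained network at a few unseen points
x_test = np.array([[-0.9], [0.0], [0.5]])   # hypothetical test inputs, shape [3, 1]
print(sess.run(prediction, feed_dict={x: x_test}))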
