There is a good tutorial online, Tensorflow案例实战视频课程05 构造线性回归模型 (TensorFlow hands-on video course, lesson 05: building a linear regression model). I worked through it once and started to get a feel for things.
import numpy as np
import tensorflow as tf
import matplotlib.pyplot as plt

# Randomly generate 1000 points scattered around the line y = 0.1x + 0.3
num_points = 1000
vectors_set = []
for i in range(num_points):
    x1 = np.random.normal(0.0, 0.55)
    y1 = x1 * 0.1 + 0.3 + np.random.normal(0.0, 0.03)
    vectors_set.append([x1, y1])

# Split the samples into x and y lists
x_data = [v[0] for v in vectors_set]
y_data = [v[1] for v in vectors_set]

plt.scatter(x_data, y_data, c='r')
plt.show()
[Figure: the generated point cloud scattered around the line y = 0.1x + 0.3]
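As an aside (not from the tutorial), the same synthetic data can be generated without the Python loop by using NumPy's vectorized sampling; the rest of the script works unchanged with these arrays:

# Vectorized alternative for the data generation above (a sketch, my own addition)
x_data = np.random.normal(0.0, 0.55, num_points)
y_data = x_data * 0.1 + 0.3 + np.random.normal(0.0, 0.03, num_points)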
# W is a 1-D variable initialized to a random value in [-1, 1]
W = tf.Variable(tf.random_uniform([1], -1.0, 1.0), name='W')
# b is a 1-D variable initialized to 0
b = tf.Variable(tf.zeros([1]), name='b')
# The predicted value y
y = W * x_data + b
# Use the mean squared error between the prediction y and the actual y_data as the loss
loss = tf.reduce_mean(tf.square(y - y_data), name='loss')
# Optimize the parameters with gradient descent (learning rate 0.5)
optimizer = tf.train.GradientDescentOptimizer(0.5)
# Training means minimizing this loss
train = optimizer.minimize(loss, name='train')

sess = tf.Session()
init = tf.global_variables_initializer()
sess.run(init)

# Print the initial W and b
print("W=", sess.run(W), "b=", sess.run(b), "loss=", sess.run(loss))
# Run 20 training steps
for step in range(20):
    sess.run(train)
    # Print W, b and the loss after each step
    print("W=", sess.run(W), "b=", sess.run(b), "loss=", sess.run(loss))
The training output is as follows:
W= [0.83205175] b= [0.] loss= 0.23633502
W= [0.6220981] b= [0.27340004] loss= 0.08258922
W= [0.4656933] b= [0.28100777] loss= 0.040913653
W= [0.3562842] b= [0.2866752] loss= 0.020513779
W= [0.27973717] b= [0.2906397] loss= 0.01052812
W= [0.22618186] b= [0.29341343] loss= 0.005640187
W= [0.18871245] b= [0.29535404] loss= 0.0032475654
W= [0.16249736] b= [0.29671177] loss= 0.0020763879
W= [0.14415625] b= [0.2976617] loss= 0.0015031014
W= [0.13132408] b= [0.29832628] loss= 0.0012224801
W= [0.12234619] b= [0.29879126] loss= 0.0010851173
W= [0.11606489] b= [0.29911658] loss= 0.0010178789
W= [0.11167025] b= [0.2993442] loss= 0.0009849658
W= [0.10859558] b= [0.29950345] loss= 0.00096885505
W= [0.10644443] b= [0.29961485] loss= 0.0009609689
W= [0.10493939] b= [0.2996928] loss= 0.00095710874
W= [0.10388641] b= [0.29974735] loss= 0.00095521915
W= [0.1031497] b= [0.2997855] loss= 0.0009542942
W= [0.10263427] b= [0.2998122] loss= 0.00095384143
W= [0.10227366] b= [0.29983085] loss= 0.00095361995
W= [0.10202136] b= [0.29984394] loss= 0.0009535112
Starting from loss = 0.236 on the first line, the loss drops rapidly and settles around 0.00095 (roughly the variance of the injected noise, 0.03² = 0.0009), while W converges toward 0.1 and b toward 0.3.
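As a quick sanity check (my own addition, not part of the tutorial), the closed-form least-squares fit from NumPy lands on essentially the same parameters that gradient descent converges to:

# Closed-form least-squares fit: np.polyfit returns the slope and intercept for degree 1
w_ls, b_ls = np.polyfit(x_data, y_data, 1)
print("least-squares fit: W =", w_ls, "b =", b_ls)  # should be close to 0.1 and 0.3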
plt.scatter(x_data, y_data, c='r')
plt.plot(x_data, sess.run(W) * x_data + sess.run(b))
plt.show()
Plotting the fitted line over the point cloud:
[Figure: the fitted line, approximately y = 0.1x + 0.3, drawn through the scatter points]
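For reference, the code above uses the TensorFlow 1.x API (tf.Session, tf.train.GradientDescentOptimizer), which is only available in TensorFlow 2 through tf.compat.v1. Below is a minimal sketch of the same training loop written against the TF2 eager API; it assumes TF2 is installed, reuses the x_data/y_data generated above, and is my own adaptation rather than part of the tutorial:

import numpy as np
import tensorflow as tf

# Same model: y = W * x + b, with W random in [-1, 1] and b starting at 0
W = tf.Variable(tf.random.uniform([1], -1.0, 1.0), name='W')
b = tf.Variable(tf.zeros([1]), name='b')
optimizer = tf.keras.optimizers.SGD(learning_rate=0.5)

x = tf.constant(x_data, dtype=tf.float32)
y_true = tf.constant(y_data, dtype=tf.float32)

for step in range(20):
    with tf.GradientTape() as tape:
        # Mean squared error between the prediction and the data
        loss = tf.reduce_mean(tf.square(W * x + b - y_true))
    grads = tape.gradient(loss, [W, b])
    optimizer.apply_gradients(zip(grads, [W, b]))
    print("W=", W.numpy(), "b=", b.numpy(), "loss=", loss.numpy())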