TensorFlow Linear Regression Example

Author: leogoforit | Published 2020-03-20 14:45

Working through 胡海曼's deep-learning tutorial, the first example is linear regression. The complete source code for the tutorial series is also available on her GitHub.
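Note: the code below uses the TensorFlow 1.x API (tf.placeholder, tf.Session, tf.train.GradientDescentOptimizer). If you only have TensorFlow 2.x installed, one way to run it unchanged (a compatibility shim of my own, not part of the original tutorial) is to swap the import for the v1 compatibility layer:

# Run TF 1.x-style code on a TensorFlow 2.x installation
import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()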

# -*- coding: utf-8 -*-
import tensorflow as tf
import numpy
import matplotlib.pyplot as plt
rng = numpy.random

# Hyperparameters
learning_rate = 0.01
training_epochs = 10000
display_step = 50

# Training data
train_X = numpy.asarray([3.3,4.4,5.5,6.71,6.93,4.168,9.779,6.182,7.59,2.167,
                         7.042,10.791,5.313,7.997,5.654,9.27,3.1])
train_Y = numpy.asarray([1.7,2.76,2.09,3.19,1.694,1.573,3.366,2.596,2.53,1.221,
                         2.827,3.465,1.65,2.904,2.42,2.94,1.3])
n_samples = train_X.shape[0]
print ("train_X:",train_X)
print ("train_Y:",train_Y)

# Placeholders for the input x and the target y
X = tf.placeholder("float")
Y = tf.placeholder("float")

# Model weight and bias, randomly initialized
W = tf.Variable(rng.randn(), name="weight")
b = tf.Variable(rng.randn(), name="bias")

# The linear model: pred = W*X + b
pred = tf.add(tf.multiply(X, W), b)

# Cost: mean squared error (with the conventional 1/2 factor)
cost = tf.reduce_sum(tf.pow(pred-Y, 2))/(2*n_samples)
# Gradient descent
# Note: minimize() updates W and b automatically, because Variables
# default to trainable=True
optimizer = tf.train.GradientDescentOptimizer(learning_rate).minimize(cost)
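# For reference, the cost above implements
#     cost = (1 / (2 * n_samples)) * sum_i (W*x_i + b - y_i)^2
# and each optimizer step applies plain gradient descent:
#     W <- W - learning_rate * d(cost)/dW
#     b <- b - learning_rate * d(cost)/db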

# Initialize all variables
init = tf.global_variables_initializer()

# Start training
with tf.Session() as sess:
    sess.run(init)

    # Feed all training data each epoch, one sample at a time
    for epoch in range(training_epochs):
        for (x, y) in zip(train_X, train_Y):
            sess.run(optimizer, feed_dict={X: x, Y: y})
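        # Note: feeding one (x, y) pair per run() call makes this stochastic
        # gradient descent; feeding the full arrays in a single call would
        # instead take one batch gradient step over all 17 samples.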

        # Log progress every display_step epochs
        if (epoch+1) % display_step == 0:
            c = sess.run(cost, feed_dict={X: train_X, Y: train_Y})
            print("Epoch:", '%04d' % (epoch+1), "cost=", "{:.9f}".format(c),
                  "W=", sess.run(W), "b=", sess.run(b))

    print ("Optimization Finished!")
    training_cost = sess.run(cost, feed_dict={X: train_X, Y: train_Y})
    print ("Training cost=", training_cost, "W=", sess.run(W), "b=", sess.run(b), '\n')

    # Plot the fitted line against the training data
    plt.plot(train_X, train_Y, 'ro', label='Original data')
    plt.plot(train_X, sess.run(W) * train_X + sess.run(b), label='Fitted line')
    plt.legend()
    plt.show()
    
    # Test samples
    test_X = numpy.asarray([6.83, 4.668, 8.9, 7.91, 5.7, 8.7, 3.1, 2.1])
    test_Y = numpy.asarray([1.84, 2.273, 3.2, 2.831, 2.92, 3.24, 1.35, 1.03])

    print("Testing... (Mean square loss Comparison)")
    testing_cost = sess.run(
        tf.reduce_sum(tf.pow(pred - Y, 2)) / (2 * test_X.shape[0]),
        feed_dict={X: test_X, Y: test_Y})  # same function as cost above
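    # Note: pred reuses the trained W and b; only the placeholders X and Y
    # are fed with the test arrays here, so this evaluates the same MSE
    # formula on the test set.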
    print("Testing cost=", testing_cost)
    print("Absolute mean square loss difference:", abs(
        training_cost - testing_cost))

    plt.plot(test_X, test_Y, 'bo', label='Testing data')
    plt.plot(train_X, sess.run(W) * train_X + sess.run(b), label='Fitted line')
    plt.legend()
    plt.show()
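As a quick sanity check (my own addition, not part of the original tutorial), simple linear regression also has a closed-form least-squares solution, so the converged W and b can be verified with plain NumPy:

# Closed-form least-squares fit for y = W*x + b, using NumPy only
import numpy

train_X = numpy.asarray([3.3, 4.4, 5.5, 6.71, 6.93, 4.168, 9.779, 6.182, 7.59,
                         2.167, 7.042, 10.791, 5.313, 7.997, 5.654, 9.27, 3.1])
train_Y = numpy.asarray([1.7, 2.76, 2.09, 3.19, 1.694, 1.573, 3.366, 2.596, 2.53,
                         1.221, 2.827, 3.465, 1.65, 2.904, 2.42, 2.94, 1.3])

# W = cov(X, Y) / var(X);  b = mean(Y) - W * mean(X)
dx = train_X - train_X.mean()
dy = train_Y - train_Y.mean()
W_ls = (dx * dy).sum() / (dx ** 2).sum()
b_ls = train_Y.mean() - W_ls * train_X.mean()
print("closed-form W =", W_ls, "b =", b_ls)
# The gradient-descent run above should converge toward these values.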

The output is:

Epoch: 0050 cost= 0.204349652 W= 0.050527416 b= 2.2335343
Epoch: 0100 cost= 0.189667836 W= 0.06236654 b= 2.1483645
Epoch: 0150 cost= 0.176679239 W= 0.073501796 b= 2.0682585
Epoch: 0200 cost= 0.165188596 W= 0.08397481 b= 1.992916
Epoch: 0250 cost= 0.155022874 W= 0.09382538 b= 1.9220517
Epoch: 0300 cost= 0.146030232 W= 0.10308945 b= 1.8554069
Epoch: 0350 cost= 0.138074681 W= 0.11180251 b= 1.7927257
Epoch: 0400 cost= 0.131036356 W= 0.11999757 b= 1.733771
Epoch: 0450 cost= 0.124809787 W= 0.12770519 b= 1.6783234
Epoch: 0500 cost= 0.119300954 W= 0.13495448 b= 1.6261718
Epoch: 0550 cost= 0.114427045 W= 0.14177313 b= 1.5771189
Epoch: 0600 cost= 0.110115424 W= 0.14818579 b= 1.5309868
Epoch: 0650 cost= 0.106301151 W= 0.1542166 b= 1.4876018
Epoch: 0700 cost= 0.102926441 W= 0.15988891 b= 1.4467955
Epoch: 0750 cost= 0.099940613 W= 0.16522409 b= 1.4084144
Epoch: 0800 cost= 0.097298898 W= 0.17024207 b= 1.3723154
Epoch: 0850 cost= 0.094961740 W= 0.17496146 b= 1.3383646
Epoch: 0900 cost= 0.092893802 W= 0.17940038 b= 1.3064313
Epoch: 0950 cost= 0.091064177 W= 0.18357533 b= 1.2763975
Epoch: 1000 cost= 0.089445353 W= 0.18750195 b= 1.2481494
Epoch: 1050 cost= 0.088012971 W= 0.19119523 b= 1.2215796
Epoch: 1100 cost= 0.086745843 W= 0.19466822 b= 1.196596
Epoch: 1150 cost= 0.085624710 W= 0.19793454 b= 1.1730984
Epoch: 1200 cost= 0.084632702 W= 0.20100646 b= 1.150999
Epoch: 1250 cost= 0.083754852 W= 0.20389594 b= 1.1302123
Epoch: 1300 cost= 0.082978085 W= 0.20661353 b= 1.1106623
Epoch: 1350 cost= 0.082290746 W= 0.20916954 b= 1.092275
Epoch: 1400 cost= 0.081682473 W= 0.21157353 b= 1.0749803
Epoch: 1450 cost= 0.081144169 W= 0.21383473 b= 1.0587131
Epoch: 1500 cost= 0.080667801 W= 0.21596141 b= 1.0434139
Epoch: 1550 cost= 0.080246247 W= 0.21796162 b= 1.0290246
Epoch: 1600 cost= 0.079873174 W= 0.21984296 b= 1.0154907
Epoch: 1650 cost= 0.079542965 W= 0.22161247 b= 1.0027609
Epoch: 1700 cost= 0.079250753 W= 0.22327666 b= 0.9907885
Epoch: 1750 cost= 0.078992076 W= 0.22484198 b= 0.9795277
Epoch: 1800 cost= 0.078763157 W= 0.22631417 b= 0.9689365
Epoch: 1850 cost= 0.078560516 W= 0.22769892 b= 0.95897526
Epoch: 1900 cost= 0.078381225 W= 0.22900082 b= 0.9496096
Epoch: 1950 cost= 0.078222491 W= 0.2302254 b= 0.94080025
Epoch: 2000 cost= 0.078081980 W= 0.23137711 b= 0.9325147
Epoch: 2050 cost= 0.077957571 W= 0.23246041 b= 0.92472154
Epoch: 2100 cost= 0.077847429 W= 0.23347919 b= 0.91739213
Epoch: 2150 cost= 0.077749915 W= 0.23443748 b= 0.9104984
Epoch: 2200 cost= 0.077663586 W= 0.23533875 b= 0.904015
Epoch: 2250 cost= 0.077587135 W= 0.23618644 b= 0.8979164
Epoch: 2300 cost= 0.077519424 W= 0.2369838 b= 0.89218074
Epoch: 2350 cost= 0.077459462 W= 0.23773368 b= 0.8867858
Epoch: 2400 cost= 0.077406369 W= 0.23843904 b= 0.88171166
Epoch: 2450 cost= 0.077359341 W= 0.23910244 b= 0.8769388
Epoch: 2500 cost= 0.077317670 W= 0.23972642 b= 0.8724502
Epoch: 2550 cost= 0.077280767 W= 0.24031332 b= 0.86822826
Epoch: 2600 cost= 0.077248067 W= 0.24086526 b= 0.8642574
Epoch: 2650 cost= 0.077219114 W= 0.24138428 b= 0.8605238
Epoch: 2700 cost= 0.077193446 W= 0.2418726 b= 0.85701054
Epoch: 2750 cost= 0.077170685 W= 0.24233179 b= 0.85370743
Epoch: 2800 cost= 0.077150516 W= 0.24276373 b= 0.8505998
Epoch: 2850 cost= 0.077132657 W= 0.24316992 b= 0.84767807
Epoch: 2900 cost= 0.077116832 W= 0.24355206 b= 0.8449288
Epoch: 2950 cost= 0.077102758 W= 0.24391139 b= 0.8423438
Epoch: 3000 cost= 0.077090323 W= 0.2442494 b= 0.83991224
Epoch: 3050 cost= 0.077079259 W= 0.2445674 b= 0.83762455
Epoch: 3100 cost= 0.077069469 W= 0.24486633 b= 0.8354743
Epoch: 3150 cost= 0.077060774 W= 0.24514751 b= 0.8334513
Epoch: 3200 cost= 0.077053092 W= 0.24541213 b= 0.8315476
Epoch: 3250 cost= 0.077046230 W= 0.24566086 b= 0.8297582
Epoch: 3300 cost= 0.077040143 W= 0.24589477 b= 0.8280754
Epoch: 3350 cost= 0.077034757 W= 0.2461149 b= 0.8264919
Epoch: 3400 cost= 0.077029966 W= 0.24632198 b= 0.825002
Epoch: 3450 cost= 0.077025712 W= 0.24651662 b= 0.8236021
Epoch: 3500 cost= 0.077021934 W= 0.24669963 b= 0.82228553
Epoch: 3550 cost= 0.077018574 W= 0.24687171 b= 0.8210474
Epoch: 3600 cost= 0.077015594 W= 0.24703377 b= 0.8198813
Epoch: 3650 cost= 0.077012941 W= 0.24718618 b= 0.8187851
Epoch: 3700 cost= 0.077010572 W= 0.2473298 b= 0.8177522
Epoch: 3750 cost= 0.077008486 W= 0.24746428 b= 0.81678486
Epoch: 3800 cost= 0.077006593 W= 0.24759096 b= 0.8158736
Epoch: 3850 cost= 0.077004947 W= 0.2477101 b= 0.8150164
Epoch: 3900 cost= 0.077003457 W= 0.24782214 b= 0.8142104
Epoch: 3950 cost= 0.077002145 W= 0.24792781 b= 0.81345004
Epoch: 4000 cost= 0.077000983 W= 0.24802683 b= 0.812737
Epoch: 4050 cost= 0.076999925 W= 0.24812037 b= 0.81206447
Epoch: 4100 cost= 0.076998979 W= 0.24820836 b= 0.81143177
Epoch: 4150 cost= 0.076998152 W= 0.24829064 b= 0.81083935
Epoch: 4200 cost= 0.076997414 W= 0.24836808 b= 0.8102826
Epoch: 4250 cost= 0.076996744 W= 0.24844112 b= 0.8097573
Epoch: 4300 cost= 0.076996133 W= 0.24850988 b= 0.80926275
Epoch: 4350 cost= 0.076995626 W= 0.24857417 b= 0.8088004
Epoch: 4400 cost= 0.076995149 W= 0.2486349 b= 0.80836296
Epoch: 4450 cost= 0.076994725 W= 0.24869208 b= 0.8079517
Epoch: 4500 cost= 0.076994352 W= 0.24874586 b= 0.80756444
Epoch: 4550 cost= 0.076993994 W= 0.24879652 b= 0.8072006
Epoch: 4600 cost= 0.076993696 W= 0.24884404 b= 0.8068587
Epoch: 4650 cost= 0.076993413 W= 0.24888888 b= 0.80653566
Epoch: 4700 cost= 0.076993167 W= 0.24893084 b= 0.8062343
Epoch: 4750 cost= 0.076992959 W= 0.24897039 b= 0.80594975
Epoch: 4800 cost= 0.076992743 W= 0.24900782 b= 0.80568016
Epoch: 4850 cost= 0.076992556 W= 0.2490432 b= 0.80542624
Epoch: 4900 cost= 0.076992400 W= 0.24907647 b= 0.8051861
Epoch: 4950 cost= 0.076992244 W= 0.24910738 b= 0.8049647
Epoch: 5000 cost= 0.076992109 W= 0.24913646 b= 0.8047547
Epoch: 5050 cost= 0.076991998 W= 0.24916321 b= 0.804562
Epoch: 5100 cost= 0.076991893 W= 0.2491888 b= 0.8043784
Epoch: 5150 cost= 0.076991811 W= 0.24921277 b= 0.80420595
Epoch: 5200 cost= 0.076991715 W= 0.24923539 b= 0.80404323
Epoch: 5250 cost= 0.076991625 W= 0.24925673 b= 0.8038898
Epoch: 5300 cost= 0.076991551 W= 0.2492772 b= 0.8037424
Epoch: 5350 cost= 0.076991498 W= 0.24929598 b= 0.80360776
Epoch: 5400 cost= 0.076991439 W= 0.24931346 b= 0.80348206
Epoch: 5450 cost= 0.076991387 W= 0.24933012 b= 0.8033622
Epoch: 5500 cost= 0.076991320 W= 0.24934587 b= 0.80324894
Epoch: 5550 cost= 0.076991282 W= 0.24936043 b= 0.8031436
Epoch: 5600 cost= 0.076991245 W= 0.24937476 b= 0.8030408
Epoch: 5650 cost= 0.076991208 W= 0.24938856 b= 0.8029419
Epoch: 5700 cost= 0.076991186 W= 0.24940091 b= 0.8028525
Epoch: 5750 cost= 0.076991148 W= 0.24941225 b= 0.8027713
Epoch: 5800 cost= 0.076991126 W= 0.24942264 b= 0.80269575
Epoch: 5850 cost= 0.076991089 W= 0.2494327 b= 0.8026246
Epoch: 5900 cost= 0.076991051 W= 0.24944188 b= 0.80255824
Epoch: 5950 cost= 0.076991044 W= 0.24945104 b= 0.80249196
Epoch: 6000 cost= 0.076991029 W= 0.24945939 b= 0.80243206
Epoch: 6050 cost= 0.076991007 W= 0.24946737 b= 0.8023746
Epoch: 6100 cost= 0.076990984 W= 0.24947532 b= 0.8023175
Epoch: 6150 cost= 0.076990969 W= 0.2494826 b= 0.8022646
Epoch: 6200 cost= 0.076990969 W= 0.24948889 b= 0.8022191
Epoch: 6250 cost= 0.076990955 W= 0.24949503 b= 0.80217487
Epoch: 6300 cost= 0.076990932 W= 0.2495015 b= 0.8021293
Epoch: 6350 cost= 0.076990910 W= 0.2495075 b= 0.80208623
Epoch: 6400 cost= 0.076990910 W= 0.24951248 b= 0.8020494
Epoch: 6450 cost= 0.076990910 W= 0.2495172 b= 0.8020166
Epoch: 6500 cost= 0.076990888 W= 0.24952155 b= 0.8019845
Epoch: 6550 cost= 0.076990888 W= 0.24952576 b= 0.8019547
Epoch: 6600 cost= 0.076990873 W= 0.2495299 b= 0.80192506
Epoch: 6650 cost= 0.076990865 W= 0.24953318 b= 0.8019009
Epoch: 6700 cost= 0.076990865 W= 0.24953614 b= 0.8018798
Epoch: 6750 cost= 0.076990850 W= 0.24953903 b= 0.80185896
Epoch: 6800 cost= 0.076990850 W= 0.24954212 b= 0.8018369
Epoch: 6850 cost= 0.076990843 W= 0.24954554 b= 0.8018126
Epoch: 6900 cost= 0.076990850 W= 0.2495488 b= 0.801788
Epoch: 6950 cost= 0.076990835 W= 0.2495517 b= 0.8017682
Epoch: 7000 cost= 0.076990828 W= 0.24955374 b= 0.801753
Epoch: 7050 cost= 0.076990813 W= 0.2495559 b= 0.8017381
Epoch: 7100 cost= 0.076990806 W= 0.24955758 b= 0.8017255
Epoch: 7150 cost= 0.076990828 W= 0.24955891 b= 0.8017156
Epoch: 7200 cost= 0.076990828 W= 0.24956031 b= 0.8017067
Epoch: 7250 cost= 0.076990806 W= 0.24956149 b= 0.80169773
Epoch: 7300 cost= 0.076990820 W= 0.24956271 b= 0.8016888
Epoch: 7350 cost= 0.076990806 W= 0.2495643 b= 0.80167747
Epoch: 7400 cost= 0.076990806 W= 0.24956599 b= 0.80166554
Epoch: 7450 cost= 0.076990791 W= 0.24956761 b= 0.8016536
Epoch: 7500 cost= 0.076990813 W= 0.2495693 b= 0.8016417
Epoch: 7550 cost= 0.076990806 W= 0.24957074 b= 0.8016311
Epoch: 7600 cost= 0.076990813 W= 0.249572 b= 0.8016214
Epoch: 7650 cost= 0.076990791 W= 0.24957383 b= 0.80160946
Epoch: 7700 cost= 0.076990813 W= 0.24957524 b= 0.80159754
Epoch: 7750 cost= 0.076990791 W= 0.24957699 b= 0.8015856
Epoch: 7800 cost= 0.076990791 W= 0.2495787 b= 0.8015737
Epoch: 7850 cost= 0.076990791 W= 0.24958034 b= 0.8015618
Epoch: 7900 cost= 0.076990791 W= 0.249582 b= 0.8015498
Epoch: 7950 cost= 0.076990776 W= 0.24958374 b= 0.80153763
Epoch: 8000 cost= 0.076990768 W= 0.24958518 b= 0.80152714
Epoch: 8050 cost= 0.076990791 W= 0.24958599 b= 0.8015209
Epoch: 8100 cost= 0.076990783 W= 0.2495868 b= 0.8015161
Epoch: 8150 cost= 0.076990776 W= 0.24958722 b= 0.80151314
Epoch: 8200 cost= 0.076990761 W= 0.24958752 b= 0.80151016
Epoch: 8250 cost= 0.076990791 W= 0.24958785 b= 0.8015072
Epoch: 8300 cost= 0.076990783 W= 0.24958839 b= 0.8015042
Epoch: 8350 cost= 0.076990761 W= 0.24958912 b= 0.80149984
Epoch: 8400 cost= 0.076990768 W= 0.2495898 b= 0.8014939
Epoch: 8450 cost= 0.076990761 W= 0.24959056 b= 0.8014884
Epoch: 8500 cost= 0.076990761 W= 0.24959101 b= 0.8014854
Epoch: 8550 cost= 0.076990776 W= 0.2495914 b= 0.80148244
Epoch: 8600 cost= 0.076990768 W= 0.24959166 b= 0.80147946
Epoch: 8650 cost= 0.076990783 W= 0.24959211 b= 0.8014765
Epoch: 8700 cost= 0.076990761 W= 0.24959257 b= 0.8014735
Epoch: 8750 cost= 0.076990791 W= 0.24959296 b= 0.8014705
Epoch: 8800 cost= 0.076990776 W= 0.24959335 b= 0.80146754
Epoch: 8850 cost= 0.076990761 W= 0.2495938 b= 0.80146456
Epoch: 8900 cost= 0.076990776 W= 0.24959418 b= 0.8014616
Epoch: 8950 cost= 0.076990761 W= 0.2495947 b= 0.8014586
Epoch: 9000 cost= 0.076990776 W= 0.24959506 b= 0.8014556
Epoch: 9050 cost= 0.076990776 W= 0.24959542 b= 0.80145264
Epoch: 9100 cost= 0.076990768 W= 0.24959598 b= 0.80144966
Epoch: 9150 cost= 0.076990753 W= 0.24959634 b= 0.8014467
Epoch: 9200 cost= 0.076990768 W= 0.24959674 b= 0.8014437
Epoch: 9250 cost= 0.076990761 W= 0.24959725 b= 0.8014407
Epoch: 9300 cost= 0.076990746 W= 0.24959758 b= 0.80143774
Epoch: 9350 cost= 0.076990768 W= 0.24959788 b= 0.80143476
Epoch: 9400 cost= 0.076990768 W= 0.24959834 b= 0.8014316
Epoch: 9450 cost= 0.076990776 W= 0.24959908 b= 0.80142576
Epoch: 9500 cost= 0.076990768 W= 0.24959989 b= 0.8014198
Epoch: 9550 cost= 0.076990768 W= 0.24960074 b= 0.80141383
Epoch: 9600 cost= 0.076990746 W= 0.24960172 b= 0.8014079
Epoch: 9650 cost= 0.076990761 W= 0.24960251 b= 0.8014019
Epoch: 9700 cost= 0.076990761 W= 0.24960332 b= 0.80139595
Epoch: 9750 cost= 0.076990776 W= 0.24960385 b= 0.8013917
Epoch: 9800 cost= 0.076990776 W= 0.24960423 b= 0.80138874
Epoch: 9850 cost= 0.076990753 W= 0.24960473 b= 0.80138576
Epoch: 9900 cost= 0.076990746 W= 0.24960524 b= 0.8013828
Epoch: 9950 cost= 0.076990753 W= 0.24960566 b= 0.8013798
Epoch: 10000 cost= 0.076990746 W= 0.2496061 b= 0.8013768
Optimization Finished!
Training cost= 0.076990746 W= 0.2496061 b= 0.8013768 
[Figure: fitted line over the training data]
[Figure: fitted line over the test data]
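With the converged parameters, a prediction for a new input is simply W*x + b. For example (my arithmetic, not part of the original log), for x = 5.0:

pred = 0.2496061 * 5.0 + 0.8013768   # ≈ 2.049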
