Forward propagation + loss function + learning rate + backpropagation + session
import tensorflow as tf
from numpy.random import RandomState

batch_size = 8

# Parameters of a 2-3-1 fully connected network.
w1 = tf.Variable(tf.random_normal([2, 3], stddev=1, seed=1))
w2 = tf.Variable(tf.random_normal([3, 1], stddev=1, seed=1))

# Placeholders for the input features and the labels.
x = tf.placeholder(tf.float32, shape=(None, 2), name="x-input")
y_ = tf.placeholder(tf.float32, shape=(None, 1), name="y-input")

# Forward propagation.
a = tf.matmul(x, w1)
y = tf.matmul(a, w2)
y = tf.sigmoid(y)

# Binary cross-entropy loss; clipping keeps log() away from 0.
cross_entropy = -tf.reduce_mean(
    y_ * tf.log(tf.clip_by_value(y, 1e-10, 1.0))
    + (1 - y_) * tf.log(tf.clip_by_value(1 - y, 1e-10, 1.0)))

# Backpropagation: Adam with a learning rate of 0.001.
train_step = tf.train.AdamOptimizer(0.001).minimize(cross_entropy)

# Simulated dataset: 128 samples, label 1 when x1 + x2 < 1.
rdm = RandomState(1)
X = rdm.rand(128, 2)
Y = [[int(x1 + x2 < 1)] for (x1, x2) in X]

with tf.Session() as sess:
    init_op = tf.global_variables_initializer()
    sess.run(init_op)

    # Print the current (untrained) parameter values.
    print(sess.run(w1))
    print(sess.run(w2))
    print("\n")

    # Train the model.
    STEPS = 12000
    for i in range(STEPS):
        # Cycle a mini-batch of batch_size samples through the 128-sample set.
        start = (i * batch_size) % 128
        end = start + batch_size
        sess.run(train_step, feed_dict={x: X[start:end], y_: Y[start:end]})
        if i % 1000 == 0:
            # Every 1000 steps, report the cross entropy on the full dataset.
            total_cross_entropy = sess.run(cross_entropy,
                                           feed_dict={x: X, y_: Y})
            print("After %d training step(s), cross entropy on all data is %g"
                  % (i, total_cross_entropy))

    # Print the trained parameter values.
    print("\n")
    print(sess.run(w1))
    print(sess.run(w2))
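Written out, the loss the code minimizes is the averaged binary cross entropy (N and \hat{y} are notation introduced here, not names from the code):

H(y\_, \hat{y}) = -\frac{1}{N}\sum_{i=1}^{N}\Big[\, y_i \log \hat{y}_i + (1-y_i)\log(1-\hat{y}_i) \,\Big], \qquad \hat{y}_i = \sigma(x_i w_1 w_2)

Both logarithm arguments are clipped to [1e-10, 1.0] by tf.clip_by_value, so log(0) can never produce an inf or NaN during training.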
Result of the 12000 training steps: the cross entropy over the full dataset keeps shrinking, yielding the optimized parameters w1 and w2. The first two matrices below are the initial (untrained) w1 and w2; the trained values follow the cross-entropy log, and a sketch that verifies them comes after the output.
[[-0.8113182 1.4845988 0.06532937]
[-2.4427042 0.0992484 0.5912243 ]]
[[-0.8113182 ]
[ 1.4845988 ]
[ 0.06532937]]
After 0 training step(s), cross entropy on all data is 1.89805
After 1000 training step(s), cross entropy on all data is 0.655075
After 2000 training step(s), cross entropy on all data is 0.626172
After 3000 training step(s), cross entropy on all data is 0.615096
After 4000 training step(s), cross entropy on all data is 0.610309
After 5000 training step(s), cross entropy on all data is 0.608679
After 6000 training step(s), cross entropy on all data is 0.608231
After 7000 training step(s), cross entropy on all data is 0.608114
After 8000 training step(s), cross entropy on all data is 0.608088
After 9000 training step(s), cross entropy on all data is 0.608081
After 10000 training step(s), cross entropy on all data is 0.608079
After 11000 training step(s), cross entropy on all data is 0.608079
[[ 0.08924854 0.51599807 1.7538922 ]
[-2.2377944 -0.20479864 1.0734867 ]]
[[-0.49589315]
[ 0.4026622 ]
[-1.0064225 ]]
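To make "optimized parameters" concrete, here is a minimal NumPy sketch, not part of the original script, that plugs the trained w1 and w2 printed above back into the same forward pass and checks the predictions against the x1 + x2 < 1 labeling rule:

import numpy as np

# Trained weights as printed after 12000 steps (copied from the output above).
w1 = np.array([[ 0.08924854,  0.51599807,  1.7538922 ],
               [-2.2377944 , -0.20479864,  1.0734867 ]])
w2 = np.array([[-0.49589315],
               [ 0.4026622 ],
               [-1.0064225 ]])

# Rebuild the same dataset the training script used (same seed).
rng = np.random.RandomState(1)
X = rng.rand(128, 2)
Y = np.array([[int(x1 + x2 < 1)] for (x1, x2) in X])

# Forward pass sigmoid(X @ w1 @ w2), matching the TensorFlow graph.
logits = X.dot(w1).dot(w2)
probs = 1.0 / (1.0 + np.exp(-logits))
preds = (probs > 0.5).astype(int)

print("accuracy on the 128 training samples: %.3f" % (preds == Y).mean())

Because RandomState(1) is reseeded identically, X and Y here are exactly the samples the model was trained on, so this measures training accuracy only, not generalization.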