TensorFlow Gradient Clipping

By 翻开日记 | Published 2018-07-28 16:05
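
In TF1.x, per-element gradient clipping is done by splitting optimizer.minimize() into compute_gradients() and apply_gradients(), clipping each gradient in between (here to [-5, 5]); config and loss_val are assumed to be defined elsewhere.
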
import tensorflow as tf

# Global step counter; apply_gradients increments it once per training step.
global_step = tf.Variable(0, trainable=False)
# Exponentially decaying learning rate (staircase=True decays in discrete jumps).
learning_rate = tf.train.exponential_decay(config.base_learning_rate, global_step,
                                           decay_steps=config.decay_steps,
                                           decay_rate=config.decay_rate,
                                           staircase=True)
# Split minimize() into compute_gradients/apply_gradients to clip in between.
optimizer = tf.train.AdamOptimizer(learning_rate)
gradients = optimizer.compute_gradients(loss_val)
# Clip each gradient element-wise to [-5, 5]; skip variables with no gradient.
capped_gradients = [(tf.clip_by_value(grad, -5., 5.), var)
                    for grad, var in gradients if grad is not None]
train_op = optimizer.apply_gradients(capped_gradients, global_step=global_step)
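
Not covered in the original post, but worth a sketch: the other common variant clips by the global norm of all gradients with tf.clip_by_global_norm, which rescales every gradient jointly when their combined L2 norm exceeds the threshold. Assuming the same loss_val, optimizer, and global_step as above (clip_norm=5.0 is illustrative):

# Sketch of global-norm clipping (assumed variant, not from the original post).
grads_and_vars = [(g, v) for g, v in optimizer.compute_gradients(loss_val)
                  if g is not None]
grads, tvars = zip(*grads_and_vars)
# Rescale all gradients together so their global L2 norm is at most clip_norm.
clipped, global_norm = tf.clip_by_global_norm(grads, clip_norm=5.0)
train_op = optimizer.apply_gradients(zip(clipped, tvars), global_step=global_step)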

