How to apply gradient clipping in TensorFlow?

Author: 月下霜溪 | Published 2018-12-21 10:17

    Gradient clipping needs to happen after computing the gradients, but before applying them to update the model's parameters. In your example, both of those things are handled by the AdamOptimizer.minimize() method.
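
    For reference, minimize() is just shorthand for the two calls that clipping has to slot between. A minimal sketch (assuming the same optimizer and cost tensor as in the snippet below):

        # These two lines are what optimizer.minimize(cost) does internally:
        gvs = optimizer.compute_gradients(cost)     # 1. compute the gradients
        train_op = optimizer.apply_gradients(gvs)   # 2. apply them
        # Gradient clipping goes between step 1 and step 2.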

    In order to clip your gradients, you'll need to explicitly compute, clip, and apply them, as described in TensorFlow's API documentation. Specifically, you'll need to replace the call to the minimize() method with something like the following:

        optimizer = tf.train.AdamOptimizer(learning_rate=learning_rate)
        # compute_gradients() returns a list of (gradient, variable) pairs.
        gvs = optimizer.compute_gradients(cost)
        # Clip each gradient element-wise to [-1, 1], skipping variables
        # that received no gradient (their entry in gvs is (None, var)).
        capped_gvs = [(tf.clip_by_value(grad, -1., 1.), var)
                      for grad, var in gvs if grad is not None]
        train_op = optimizer.apply_gradients(capped_gvs)
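
    A common alternative (not part of the original answer, just a sketch under the same assumptions) is to clip by the global norm of all gradients with tf.clip_by_global_norm(). Unlike element-wise clipping, this rescales the entire gradient vector at once and so preserves its direction; the clip_norm of 5.0 below is an arbitrary illustrative value:

        optimizer = tf.train.AdamOptimizer(learning_rate=learning_rate)
        # Separate the gradients from their variables.
        grads, variables = zip(*optimizer.compute_gradients(cost))
        # Rescale all gradients so their combined L2 norm is at most 5.0.
        # tf.clip_by_global_norm() ignores None entries in the list.
        clipped_grads, _ = tf.clip_by_global_norm(grads, clip_norm=5.0)
        train_op = optimizer.apply_gradients(list(zip(clipped_grads, variables)))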
    

    Reposted from: How to apply gradient clipping in TensorFlow?
