[NN] Regularization Summary

Author: JaiUnChat | Published 2017-08-31 20:35 · Read 14 times

    Dropout:

    • Dropout is a regularization technique.
    • Only use dropout during training; do not randomly eliminate nodes at test time.
    • Apply dropout both during forward and backward propagation.
    • During training, divide the activations of each dropout layer by keep_prob so that their expected value stays the same (this is "inverted dropout"). For example, if keep_prob is 0.5, then on average half the nodes are shut down, so the output is scaled by roughly 0.5 since only the remaining half contribute. Dividing by 0.5 is equivalent to multiplying by 2, which restores the original expected value. You can check that this works for values of keep_prob other than 0.5 as well.
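The points above can be sketched in NumPy. This is a minimal illustration of inverted dropout, not code from the original notebook; the function names (`dropout_forward`, `dropout_backward`) are my own:

```python
import numpy as np

def dropout_forward(A, keep_prob=0.5, training=True, rng=None):
    """Inverted dropout on activations A.

    During training: zero out each unit with probability 1 - keep_prob,
    then divide by keep_prob so the expected activation is unchanged.
    At test time: pass activations through untouched (no dropout).
    """
    if not training:
        return A, None
    rng = rng or np.random.default_rng()
    # Mask entries are 1 with probability keep_prob, 0 otherwise.
    D = (rng.random(A.shape) < keep_prob).astype(A.dtype)
    A_drop = A * D / keep_prob  # scale up survivors to preserve E[A]
    return A_drop, D

def dropout_backward(dA, D, keep_prob=0.5):
    # Apply the same mask and scaling during backpropagation,
    # so gradients flow only through the units kept in the forward pass.
    return dA * D / keep_prob
```

Note that the same mask `D` must be reused in the backward pass, which is why the forward function returns it.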

    What we want you to remember from this notebook:

    • Regularization will help you reduce overfitting.
    • Regularization will drive your weights to lower values.
    • L2 regularization and Dropout are two very effective regularization techniques.
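As a companion to the dropout sketch above, the L2-regularized cost mentioned in the last bullet can be illustrated as follows. This is a hypothetical helper (the name `l2_regularized_cost` and the parameter `lambd` are my own, following the common convention of avoiding Python's reserved word `lambda`):

```python
import numpy as np

def l2_regularized_cost(cross_entropy_cost, weights, lambd, m):
    """Add the L2 penalty (lambd / (2*m)) * sum of squared weights
    to the unregularized cost.

    cross_entropy_cost: scalar cost without regularization
    weights: list of weight matrices [W1, W2, ...]
    lambd:   regularization strength
    m:       number of training examples
    """
    l2_penalty = (lambd / (2 * m)) * sum(np.sum(np.square(W)) for W in weights)
    return cross_entropy_cost + l2_penalty
```

In backpropagation the penalty adds an extra term `(lambd / m) * W` to each `dW`, which shrinks the weights at every update step; this is why L2 regularization drives the weights toward lower values.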

Original link: https://www.haomeiwen.com/subject/bzscjxtx.html