
[Machine Learning] - Week 5.5 Gradient Checking

Author: Kitty_风花 | Published 2019-12-29 09:57

Gradient Checking

Gradient checking will assure that our backpropagation works as intended. We can approximate the derivative of our cost function with:
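∂/∂Θ J(Θ) ≈ (J(Θ + ϵ) − J(Θ − ϵ)) / (2ϵ)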

With multiple theta matrices, we can approximate the derivative with respect to Θj as follows:
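∂/∂Θj J(Θ) ≈ (J(Θ1, …, Θj + ϵ, …, Θn) − J(Θ1, …, Θj − ϵ, …, Θn)) / (2ϵ)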

A small value for ϵ (epsilon), such as ϵ = 10^−4, guarantees that the math works out properly. If the value for ϵ is too small, we can end up with numerical problems.
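As an illustrative check (a worked example, not part of the original notes): with J(θ) = θ^3 at θ = 1 and ϵ = 10^−4, the approximation gives ((1.0001)^3 − (0.9999)^3) / (2 · 10^−4) ≈ 3.00000001, which matches the true derivative 3θ^2 = 3 to about eight decimal places.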

Hence, we are only adding or subtracting epsilon to the Θj matrix. In Octave we can do it as follows:
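A minimal sketch of that loop, in the spirit of the course code (here theta is assumed to be the unrolled parameter vector, n its length, and J a function that evaluates the cost for a given theta):

    epsilon = 1e-4;
    for i = 1:n,
      % Perturb only the i-th parameter, once in each direction
      thetaPlus = theta;
      thetaPlus(i) = thetaPlus(i) + epsilon;
      thetaMinus = theta;
      thetaMinus(i) = thetaMinus(i) - epsilon;
      % Two-sided difference estimate of the i-th partial derivative
      gradApprox(i) = (J(thetaPlus) - J(thetaMinus)) / (2 * epsilon);
    end;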

We previously saw how to calculate the deltaVector. So once we compute our gradApprox vector, we can check that gradApprox ≈ deltaVector.
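One common way to make the "≈" precise (a heuristic on my part; these notes do not spell out a threshold) is to look at the relative difference between the two vectors:

    % Relative difference between the numerical and analytic gradients;
    % a value near machine precision (e.g. below 1e-9) suggests that the
    % backpropagation implementation is correct.
    relDiff = norm(gradApprox - deltaVector) / norm(gradApprox + deltaVector);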

Once you have verified that your backpropagation algorithm is correct, you don't need to compute gradApprox again, since the code to compute gradApprox is very slow.

Source: Coursera, Stanford University, Andrew Ng, Machine Learning
