Summary:
Batch gradient descent: use all examples in each iteration;
Stochastic gradient descent: use 1 example in each iteration;
Mini-batch gradient descent: use b examples in each iteration.
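The three variants above differ only in how many examples feed each update. A minimal NumPy sketch (linear regression with squared error, my own illustrative `train` helper, not from the original post) shows that a single `batch_size` parameter covers all three cases:

```python
import numpy as np

def gradient_step(X, y, w, lr):
    # Gradient of (1/2m) * ||Xw - y||^2 is (1/m) * X^T (Xw - y)
    m = X.shape[0]
    grad = X.T @ (X @ w - y) / m
    return w - lr * grad

def train(X, y, batch_size, lr=0.5, epochs=1000, seed=0):
    # batch_size == m      -> batch gradient descent (all examples)
    # batch_size == 1      -> stochastic gradient descent (1 example)
    # 1 < batch_size < m   -> mini-batch gradient descent (b examples)
    rng = np.random.default_rng(seed)
    m, n = X.shape
    w = np.zeros(n)
    for _ in range(epochs):
        idx = rng.permutation(m)  # reshuffle examples each epoch
        for start in range(0, m, batch_size):
            batch = idx[start:start + batch_size]
            w = gradient_step(X[batch], y[batch], w, lr)
    return w
```

For example, fitting noiseless data `y = 1 + 2x`, all three settings (`batch_size=m`, `batch_size=1`, `batch_size=10`) recover the same weights; they differ in the cost per update and the noisiness of the trajectory, not in the final answer on this simple problem.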
Title: Machine Learning: Gradient Descent
Link: https://www.haomeiwen.com/subject/ksutmdtx.html