[Machine Learning] - Week 2, 3. Gradient Descent in Practice 1 - Feature Scaling

Author: Kitty_风花 | Published 2019-11-30 10:49

Gradient Descent in Practice I - Feature Scaling

We can speed up gradient descent by having each of our input values in roughly the same range. This is because θ will descend quickly on small ranges and slowly on large ranges, and so will oscillate inefficiently down to the optimum when the variables are very uneven.
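This effect is easy to see on synthetic data. The sketch below is my own illustration, not code from the lecture: the helper run_gd and the two synthetic features are hypothetical. With the same number of gradient steps, the cost on scaled features ends up much closer to the minimum, because the unscaled problem only tolerates a tiny learning rate.

```python
import numpy as np

def run_gd(X, y, alpha, iters=500):
    """Batch gradient descent for linear regression; returns the final cost."""
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(iters):
        err = X @ theta - y
        theta -= (alpha / m) * (X.T @ err)
    err = X @ theta - y
    return (err @ err) / (2 * m)

rng = np.random.default_rng(0)
size = rng.uniform(500, 3000, 100)   # feature with a large range
rooms = rng.uniform(1, 5, 100)       # feature with a small range
y = 0.1 * size + 20 * rooms + rng.normal(0, 10, 100)

scale = lambda v: (v - v.mean()) / (v.max() - v.min())
X_raw = np.column_stack([np.ones(100), size, rooms])
X_scaled = np.column_stack([np.ones(100), scale(size), scale(rooms)])

# The unscaled features force a very small learning rate (a large one
# diverges), so after the same number of steps the cost is still far
# from optimal; the scaled version converges quickly.
print(run_gd(X_raw, y, alpha=1e-7))
print(run_gd(X_scaled, y, alpha=0.5))
```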

The way to prevent this is to modify the ranges of our input variables so that they are all roughly the same. Ideally:

-1 ≤ x_i ≤ 1

or

-0.5 ≤ x_i ≤ 0.5

These aren't exact requirements; we are only trying to speed things up. The goal is to get all input variables into roughly one of these ranges, give or take a few.

Two techniques to help with this are feature scaling and mean normalization. Feature scaling involves dividing the input values by the range (i.e. the maximum value minus the minimum value) of the input variable, resulting in a new range of just 1. Mean normalization involves subtracting the average value for an input variable from the values for that input variable, resulting in a new average value for the input variable of just zero. To implement both of these techniques, adjust your input values as shown in this formula:

x_i := (x_i - μ_i) / s_i

where μ_i is the average of all the values for feature (i) and s_i is the range of values (max - min), or s_i is the standard deviation.

Note that dividing by the range and dividing by the standard deviation give different results. The quizzes in this course use the range; the programming exercises use the standard deviation.
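As a minimal sketch of both conventions (not code from the course; the helper name mean_normalize is my own), in Python with NumPy:

```python
import numpy as np

def mean_normalize(x, denom="range"):
    """Mean-normalize a 1-D feature: subtract the mean, then divide by
    either the range (max - min) or the standard deviation."""
    mu = x.mean()
    s = x.max() - x.min() if denom == "range" else x.std()
    return (x - mu) / s

x = np.array([1.0, 2.0, 4.0, 8.0])
print(mean_normalize(x))               # range version (used in the quizzes)
print(mean_normalize(x, denom="std"))  # std version (used in the exercises)
```

Both versions produce a feature with mean zero; only the spread of the resulting values differs.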

For example, if x_i represents housing prices with a range of 100 to 2000 and a mean value of 1000, then:

x_i := (price - 1000) / 1900
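To make this concrete, here is the formula applied to the stated endpoints and the mean (the three prices are just those given values):

```python
# Apply x := (price - 1000) / 1900 to the endpoints and the mean.
for price in (100, 1000, 2000):
    print(price, (price - 1000) / 1900)
# 100  -> -0.474  (below the mean)
# 1000 ->  0.0    (at the mean)
# 2000 ->  0.526  (above the mean)
```

The new values span roughly [-0.47, 0.53], a total range of 1, consistent with the earlier claim that dividing by the range yields a new range of just 1.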

Source: Coursera, Stanford University, Andrew Ng, Machine Learning
