
Linear Regression: Ridge

Author: 麒麟楚庄王 | Published 2018-11-27 16:36

https://www.analyticsvidhya.com/blog/2017/06/a-comprehensive-guide-for-linear-ridge-and-lasso-regression/

Everything should be made as simple as possible, but not simpler – Albert Einstein

```python
import numpy as np
from sklearn.linear_model import Ridge

## training the model
## (x_train, y_train and the validation split x_cv, y_cv are assumed to exist already)
ridgeReg = Ridge(alpha=0.05, normalize=True)  # `normalize` was removed in scikit-learn 1.2;
                                              # on newer versions standardize the features separately
ridgeReg.fit(x_train, y_train)

pred = ridgeReg.predict(x_cv)

## calculating mse
mse = np.mean((pred - y_cv)**2)
mse                          ## 1348171.96

## calculating score
ridgeReg.score(x_cv, y_cv)   ## 0.5691
```
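The snippet above assumes the training and validation arrays already exist. Below is a minimal, self-contained sketch of that setup; the `make_regression` dataset is a hypothetical stand-in for the article's data, the split size is illustrative, and a `StandardScaler` pipeline replaces the removed `normalize=True` option.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.linear_model import Ridge

# Synthetic data as a stand-in for the article's dataset (hypothetical).
X, y = make_regression(n_samples=500, n_features=10, noise=20.0, random_state=0)

# Hold out a validation set, matching the x_cv / y_cv names used above.
x_train, x_cv, y_train, y_cv = train_test_split(X, y, test_size=0.3, random_state=0)

# Standardize inside a pipeline instead of the removed normalize=True option.
ridgeReg = make_pipeline(StandardScaler(), Ridge(alpha=0.05))
ridgeReg.fit(x_train, y_train)

pred = ridgeReg.predict(x_cv)
mse = np.mean((pred - y_cv) ** 2)
print("MSE:", mse)
print("R^2:", ridgeReg.score(x_cv, y_cv))
```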

[Coefficient plots compared: ordinary least squares (no alpha), alpha = 0.5, alpha = 5, alpha = 10]

We can see:

As alpha increases, the magnitudes of the coefficients shrink toward zero.

Comparing R² values, R² is largest when alpha = 0.05; we should choose the alpha with the smallest sum of squared errors (equivalently, the largest R²).

Changing alpha changes the penalty term: the larger alpha is, the heavier the penalty and the smaller the coefficients become, as the sketch below shows.
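A small sweep over alpha makes this shrinkage visible. This is only an illustrative sketch on synthetic data, not the article's exact experiment; the alpha grid mirrors the OLS / 0.5 / 5 / 10 comparison above.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge

# Synthetic data; alpha=0 corresponds to plain least squares.
X, y = make_regression(n_samples=200, n_features=8, noise=15.0, random_state=0)

for alpha in [0, 0.5, 5, 10]:
    ridge = Ridge(alpha=alpha).fit(X, y)
    coef_size = np.abs(ridge.coef_).sum()   # total coefficient magnitude
    print(f"alpha={alpha:>4}  sum|coef|={coef_size:8.2f}  R^2={ridge.score(X, y):.4f}")
```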

Key points:

1. Ridge regression shrinks the parameters, so it is mostly used to prevent multicollinearity.

2. It reduces the model complexity by coefficient shrinkage.

3. It uses the L2 regularization technique (discussed further in the linked article); a minimal closed-form sketch follows this list.
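For point 3, the L2 penalty means Ridge minimizes ||y - Xw||² + alpha·||w||², which has a closed-form solution. The NumPy sketch below is a bare-bones illustration under that assumption (no intercept or feature scaling, unlike scikit-learn's `Ridge` with its defaults), not the library's actual implementation.

```python
import numpy as np

def ridge_closed_form(X, y, alpha):
    """Minimize ||y - X w||^2 + alpha * ||w||^2 (the L2 penalty).

    Closed form: w = (X^T X + alpha * I)^(-1) X^T y.
    Intercept and feature scaling are omitted for brevity.
    """
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(n_features), X.T @ y)

# Larger alpha -> heavier penalty -> smaller coefficients.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = X @ np.array([3.0, -2.0, 0.5, 1.0, 4.0]) + rng.normal(scale=0.5, size=100)
for alpha in (0.0, 1.0, 100.0):
    print(alpha, np.round(ridge_closed_form(X, y, alpha), 3))
```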
