2019-01-31 [Stay Sharp] Lasso Regression

Author: 三千雨点 | Published 2019-01-31 22:42

Loss function with L2 regularization:

\min_{\boldsymbol{w}} \sum_{i=1}^{m} \left( y_i - \boldsymbol{w}^{\mathrm{T}} \boldsymbol{x}_i \right)^2 + \lambda \| \boldsymbol{w} \|_2^2

The model using L2 regularization is called Ridge Regression.
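
Ridge has a closed-form solution, \boldsymbol{w} = (X^{\mathrm{T}} X + \lambda I)^{-1} X^{\mathrm{T}} \boldsymbol{y}. Here is a minimal NumPy sketch of that formula; the toy data and the ridge_fit name are made up for illustration, not from the original post:

```python
import numpy as np

def ridge_fit(X, y, lam):
    """Closed-form ridge solution: w = (X^T X + lam * I)^{-1} X^T y."""
    n_features = X.shape[1]
    A = X.T @ X + lam * np.eye(n_features)
    return np.linalg.solve(A, X.T @ y)

# toy data: y depends mostly on the first feature
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = 2.0 * X[:, 0] + 0.1 * rng.normal(size=100)
print(ridge_fit(X, y, lam=1.0))
```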

Loss function with L1 regularization:

\min_{\boldsymbol{w}} \sum_{i=1}^{m} \left( y_i - \boldsymbol{w}^{\mathrm{T}} \boldsymbol{x}_i \right)^2 + \lambda \| \boldsymbol{w} \|_1

The model using L1 regularization is called Lasso Regression.
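
Unlike ridge, lasso has no closed-form solution because the L1 term is not differentiable at zero; one standard solver is cyclic coordinate descent with soft-thresholding. Below is a minimal sketch of that approach for the objective above (the lasso_cd and soft_threshold names and the toy data are illustrative assumptions):

```python
import numpy as np

def soft_threshold(rho, t):
    """S(rho, t) = sign(rho) * max(|rho| - t, 0)."""
    return np.sign(rho) * max(abs(rho) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Lasso via cyclic coordinate descent on sum (y_i - w^T x_i)^2 + lam * ||w||_1."""
    m, d = X.shape
    w = np.zeros(d)
    z = (X ** 2).sum(axis=0)  # squared column norms
    for _ in range(n_iter):
        for j in range(d):
            # correlation of feature j with the partial residual (feature j excluded)
            rho = X[:, j] @ (y - X @ w + w[j] * X[:, j])
            w[j] = soft_threshold(rho, lam / 2.0) / z[j]
    return w

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = 3.0 * X[:, 0] + 0.05 * rng.normal(size=200)
print(lasso_cd(X, y, lam=10.0))  # only the first coefficient stays non-zero
```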

For both Lasso Regression and Ridge Regression, if \lambda is zero the loss function reduces to the ordinary least-squares objective, and if \lambda is very large the model will under-fit.
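
The first claim is easy to verify numerically: with \lambda = 0 the ridge normal equations are exactly the ordinary least-squares normal equations. A quick self-contained check (toy data assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = 2.0 * X[:, 0] + 0.1 * rng.normal(size=100)

# ridge normal equations with lambda = 0 reduce to the OLS normal equations
w_ridge0 = np.linalg.solve(X.T @ X + 0.0 * np.eye(3), X.T @ y)
w_ols = np.linalg.lstsq(X, y, rcond=None)[0]
print(np.allclose(w_ridge0, w_ols))  # True
```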
The main difference between Lasso Regression and Ridge Regression is that Lasso can remove less important features entirely by shrinking their coefficients to exactly zero, whereas Ridge only shrinks coefficients toward zero without eliminating them. This makes Lasso useful for datasets with a huge number of features.
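
The sparsity effect is easy to see with scikit-learn (the toy data and alpha values below are arbitrary illustrations; note that scikit-learn's Lasso scales the squared loss by 1/(2m), so its alpha does not correspond exactly to the \lambda above):

```python
import numpy as np
from sklearn.linear_model import Lasso, Ridge

# toy data: only 2 of 10 features actually matter
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.normal(size=200)

ridge = Ridge(alpha=1.0).fit(X, y)
lasso = Lasso(alpha=0.1).fit(X, y)

print("ridge non-zero coefficients:", np.count_nonzero(ridge.coef_))  # all 10
print("lasso non-zero coefficients:", np.count_nonzero(lasso.coef_))  # about 2
```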

