Linear Regression Models (Machine Learning Linear Regression)


Author: DT数据说 | Published 2018-09-05 13:19

    Linear models are fast and efficient for regression problems in machine learning. The basic idea is to solve for the coefficient vector w and the intercept b of a linear system.

    The main variants are:

    1. Linear Regression (aka ordinary least squares)

    from sklearn.linear_model import LinearRegression
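As a minimal sketch of recovering the w and b mentioned above, fitting `LinearRegression` on synthetic data (data here is made up for illustration) exposes them as `coef_` and `intercept_`:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic data: y = 3*x + 2 plus a little noise
rng = np.random.RandomState(0)
X = rng.uniform(-3, 3, size=(100, 1))
y = 3 * X.ravel() + 2 + rng.normal(scale=0.1, size=100)

lr = LinearRegression().fit(X, y)
# coef_ is the coefficient vector w, intercept_ is the constant b
print(lr.coef_, lr.intercept_)
```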

    2. Ridge Regression (L2 regularization, which shrinks the coefficients toward 0); alpha=1 by default

    3. Lasso (L1 regularization; some coefficients are set exactly to 0, meaning some features are ignored)
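The contrast between L2 and L1 regularization can be seen directly on the fitted coefficients; a small sketch on synthetic data (features and alpha values chosen here purely for illustration):

```python
import numpy as np
from sklearn.linear_model import Ridge, Lasso

rng = np.random.RandomState(0)
X = rng.normal(size=(100, 10))
# Only the first two features matter; the other eight are noise
y = 2 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(scale=0.1, size=100)

ridge = Ridge(alpha=1.0).fit(X, y)   # alpha=1 is the default
lasso = Lasso(alpha=0.1).fit(X, y)

# Ridge keeps all coefficients small but nonzero;
# Lasso drives the irrelevant ones to exactly 0
print(np.sum(ridge.coef_ != 0), np.sum(lasso.coef_ != 0))
```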

    4. ElasticNet (combination of Lasso and Ridge)

    from sklearn.linear_model import ElasticNet, Lasso, BayesianRidge, LassoLarsIC

    5. Kernel Ridge
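Kernel Ridge combines ridge regression with the kernel trick, so a linear fit in kernel feature space can capture a nonlinear curve. A minimal sketch (the sine data and the `alpha`/`gamma` values are illustrative assumptions, not from the text):

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.RandomState(0)
X = rng.uniform(-3, 3, size=(100, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=100)

# An RBF kernel lets the ridge model fit the nonlinear sine shape
kr = KernelRidge(kernel='rbf', alpha=0.1, gamma=0.5).fit(X, y)
print(kr.score(X, y))  # training R^2
```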

    6. GradientBoostingRegressor — highest test cross-validation score

    from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
    from sklearn.model_selection import KFold

    kfold = KFold(n_splits=5)
    GBoost = GradientBoostingRegressor(n_estimators=3000, learning_rate=0.1, max_depth=4,
                                       max_features='sqrt',
                                       min_samples_leaf=15, min_samples_split=10,
                                       loss='huber', random_state=5)
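The configuration above can be exercised end to end with `cross_val_score`; in this sketch the data is synthetic and `n_estimators` is cut down from 3000 purely to keep it fast:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import KFold, cross_val_score
from sklearn.datasets import make_regression

# Synthetic regression data, stand-in for the dataset used in the text
X, y = make_regression(n_samples=200, n_features=8, noise=10, random_state=0)

kfold = KFold(n_splits=5)
gboost = GradientBoostingRegressor(n_estimators=100,   # 3000 in the text; reduced for speed
                                   learning_rate=0.1, max_depth=4,
                                   max_features='sqrt',
                                   min_samples_leaf=15, min_samples_split=10,
                                   loss='huber', random_state=5)
scores = cross_val_score(gboost, X, y, cv=kfold)
print(np.mean(scores))
```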

    7. LightGBM

    import lightgbm as lgb

    model_lgb = lgb.LGBMRegressor(objective='regression', num_leaves=5,
                                  learning_rate=0.05, n_estimators=720,
                                  max_bin=55, bagging_fraction=0.8,
                                  bagging_freq=5, feature_fraction=0.2319,
                                  feature_fraction_seed=9, bagging_seed=9,
                                  min_data_in_leaf=6, min_sum_hessian_in_leaf=11)

    8. XGBoost

    import xgboost as xgb

    model_xgb = xgb.XGBRegressor(colsample_bytree=0.4603, gamma=0.0468,
                                 learning_rate=0.05, max_depth=3,
                                 min_child_weight=1.7817, n_estimators=2200,
                                 reg_alpha=0.4640, reg_lambda=0.8571,
                                 subsample=0.5213, silent=1,
                                 random_state=7, nthread=-1)

    import numpy as np
    from sklearn.model_selection import train_test_split, cross_val_score

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model_xgb.fit(X_train, y_train)
    print("cross-validation train scores:\n{}".format(np.mean(cross_val_score(model_xgb, X_train, y_train, cv=kfold))))
    print("cross-validation test scores:\n{}".format(np.mean(cross_val_score(model_xgb, X_test, y_test, cv=kfold))))

    The test data comes from mglearn.plots.plot_linear_regression_wave()

    Test score (np.mean(cross_val_score(…))):

    Linear Regression: 0.659
    Ridge: 0.667
    GridSearchCV(Ridge(alpha=0.1)): 0.77
    Lasso: 0.74
    ElasticNet: 0.77
    KernelRidge: 0.77
    GBoost: 0.882
    LGBMRegressor: 0.69
    XGBoost: 0.708
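The GridSearchCV(Ridge) entry in the score list corresponds to tuning alpha by cross-validation; a minimal sketch of that search (synthetic data and a hypothetical alpha grid, not the text's actual setup):

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.datasets import make_regression

X, y = make_regression(n_samples=200, n_features=20, noise=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Hypothetical grid; 0.1 is the alpha reported in the score list above
param_grid = {'alpha': [0.01, 0.1, 1, 10, 100]}
grid = GridSearchCV(Ridge(), param_grid, cv=5).fit(X_train, y_train)
print(grid.best_params_, grid.score(X_test, y_test))
```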


          Link: https://www.haomeiwen.com/subject/tolawftx.html