Kaggle|Courses|XGBoost [To be completed]

Author: 十二支箭 | Published 2020-04-26 21:13

In this tutorial, you will learn how to build and optimize models with gradient boosting. This method dominates many Kaggle competitions and achieves state-of-the-art results on a variety of datasets.

Introduction

For much of this course, you have made predictions with the random forest method, which achieves better performance than a single decision tree simply by averaging the predictions of many decision trees.

We refer to the random forest method as an "ensemble method". By definition, ensemble methods combine the predictions of several models (e.g., several trees, in the case of random forests).

Next, we'll learn about another ensemble method called gradient boosting.

Gradient Boosting

Gradient boosting is a method that goes through cycles to iteratively add models into an ensemble.

It begins by initializing the ensemble with a single model, whose predictions can be pretty naive. (Even if its predictions are wildly inaccurate, subsequent additions to the ensemble will address those errors.)

Then, we start the cycle:

  • First, we use the current ensemble to generate predictions for each observation in the dataset. To make a prediction, we add the predictions from all models in the ensemble.
  • These predictions are used to calculate a loss function (like mean squared error, for instance).
  • Then, we use the loss function to fit a new model that will be added to the ensemble. Specifically, we determine model parameters so that adding this new model to the ensemble will reduce the loss. (Side note: The "gradient" in "gradient boosting" refers to the fact that we'll use gradient descent on the loss function to determine the parameters in this new model.)
  • Finally, we add the new model to the ensemble, and ...
  • ... repeat!
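
To make the cycle concrete, here is a minimal sketch in plain Python (an illustration, not the original tutorial's code and not how XGBoost is implemented internally). It assumes squared-error loss, for which the negative gradient is simply the residual, and uses scikit-learn's DecisionTreeRegressor as the base learner:

import numpy as np
from sklearn.tree import DecisionTreeRegressor

def gradient_boost(X, y, n_rounds=100, learning_rate=0.1):
    # Initialize the ensemble with a single naive model: predict the mean target.
    base_prediction = y.mean()
    prediction = np.full(len(y), base_prediction)
    trees = []
    for _ in range(n_rounds):
        # For squared-error loss, the negative gradient is just the residual.
        residuals = y - prediction
        # Fit a new model to the residuals, so adding it reduces the loss.
        tree = DecisionTreeRegressor(max_depth=3)
        tree.fit(X, residuals)
        # Add the new model to the ensemble, scaled by the learning rate.
        prediction = prediction + learning_rate * tree.predict(X)
        trees.append(tree)
    return base_prediction, trees

def ensemble_predict(base_prediction, trees, X, learning_rate=0.1):
    # To make a prediction, add the predictions from all models in the ensemble.
    return base_prediction + learning_rate * sum(tree.predict(X) for tree in trees)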

Example

We begin by loading the training and validation data in X_train, X_valid, y_train, and y_valid.
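The data-loading cell did not survive in this copy of the page. Below is a minimal sketch of what such a cell typically looks like in this course; the file name melb_data.csv and the target column Price are assumptions, not taken from this page:

import pandas as pd
from sklearn.model_selection import train_test_split

# Load the data (file name and target column are assumptions).
data = pd.read_csv('melb_data.csv')

# Separate the target from the predictors; keep numeric columns for simplicity.
y = data['Price']
X = data.drop(['Price'], axis=1).select_dtypes(exclude=['object'])

# Split the data into training and validation sets.
X_train, X_valid, y_train, y_valid = train_test_split(X, y, random_state=0)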

In this example, you'll work with the XGBoost library. XGBoost stands for extreme gradient boosting, which is an implementation of gradient boosting with several additional features focused on performance and speed. (Scikit-learn has another version of gradient boosting, but XGBoost has some technical advantages.)

In the next code cell, we import the scikit-learn API for XGBoost (xgboost.XGBRegressor). This allows us to build and fit a model just as we would in scikit-learn. As you'll see in the output, the XGBRegressor class has many tunable parameters -- you'll learn about those soon!

from xgboost import XGBRegressor

my_model = XGBRegressor()
my_model.fit(X_train, y_train)

We also make predictions and evaluate the model.
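That code cell is also missing here; a minimal sketch, assuming the mean absolute error metric used elsewhere in this course:

from sklearn.metrics import mean_absolute_error

# Predict on the validation set and evaluate with mean absolute error.
predictions = my_model.predict(X_valid)
print("Mean Absolute Error: " + str(mean_absolute_error(y_valid, predictions)))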
