Kaggle|Courses|Cross Validation

Author: 十二支箭 | Published 2020-04-25 10:37

When should you use cross-validation?

Cross-validation gives a more accurate measure of model quality, which is especially important if you are making a lot of modeling decisions. However, it can take longer to run, because it estimates multiple models (one for each fold).
So, given these tradeoffs, when should you use each approach?

- For small datasets, where extra computational burden isn't a big deal, you should run cross-validation (see the sketch after this list).
- For larger datasets, a single validation set is sufficient. Your code will run faster, and you may have enough data that there's little need to reuse some of it for holdout.
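As a minimal sketch of the cross-validation approach (not the course's exact notebook), here is how 5-fold cross-validation might look with scikit-learn's `cross_val_score`. The synthetic data from `make_regression` and the `RandomForestRegressor` pipeline are assumptions chosen for illustration; in practice you would substitute your own features and target.

```python
# A minimal sketch of 5-fold cross-validation with scikit-learn.
# The data here is synthetic (make_regression); replace X, y with your own.
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.impute import SimpleImputer
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline

X, y = make_regression(n_samples=500, n_features=10, noise=0.5, random_state=0)

# Bundle preprocessing and the model so each fold is fit independently,
# keeping the held-out fold out of the preprocessing step.
my_pipeline = Pipeline(steps=[
    ("preprocessor", SimpleImputer()),
    ("model", RandomForestRegressor(n_estimators=50, random_state=0)),
])

# cross_val_score reports negative MAE, so multiply by -1 to get MAE.
scores = -1 * cross_val_score(my_pipeline, X, y,
                              cv=5,
                              scoring="neg_mean_absolute_error")

print("MAE for each fold:", scores)
print("Average MAE across folds:", scores.mean())
```

Each of the 5 scores comes from a model trained on 4/5 of the data and evaluated on the remaining fold; averaging them gives the more reliable quality estimate described above, at the cost of fitting 5 models instead of 1.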
