Extreme Gradient Boosting

Extreme Gradient Boosting (XGBoost) is an efficient implementation of the gradient boosting framework of Chen & Guestrin (2016); this package is its R interface. The package includes an efficient linear model solver and tree learning algorithms, and it can automatically run parallel computation on a single machine, which can be more than 10 times faster than existing gradient boosting packages. It supports various objective functions, including regression, classification, and ranking. The package is designed to be extensible, so users can also define their own objectives easily.
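As a brief illustration of the built-in objectives, the sketch below trains a small binary classifier on the agaricus mushroom data shipped with the package; the specific parameter values (max_depth, eta, nthread, nrounds) are arbitrary choices for the example, not recommendations.

library(xgboost)

# Mushroom classification data bundled with the package (sparse matrices)
data(agaricus.train, package = "xgboost")
data(agaricus.test,  package = "xgboost")

# Fit a boosted-tree classifier; nthread enables single-machine parallelism
bst <- xgboost(data = agaricus.train$data, label = agaricus.train$label,
               max_depth = 2, eta = 1, nthread = 2, nrounds = 10,
               objective = "binary:logistic")

# Predicted probabilities on the held-out split
pred <- predict(bst, agaricus.test$data)
head(pred)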



Reference manual

The reference manual is available as a downloadable PDF.

install.packages("xgboost")
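To illustrate the extensibility mentioned in the description, a user-defined objective can be passed to xgb.train(); the function must return the first- and second-order gradients of the loss for each observation. The sketch below re-implements logistic loss by hand, reusing the bundled agaricus data from the example above, and is only a minimal illustration rather than a recommended setup.

library(xgboost)
data(agaricus.train, package = "xgboost")
dtrain <- xgb.DMatrix(agaricus.train$data, label = agaricus.train$label)

# Hand-rolled logistic objective: returns gradient and hessian per observation
logregobj <- function(preds, dtrain) {
  labels <- getinfo(dtrain, "label")
  preds  <- 1 / (1 + exp(-preds))   # raw scores -> probabilities
  grad   <- preds - labels
  hess   <- preds * (1 - preds)
  list(grad = grad, hess = hess)
}

bst <- xgb.train(params = list(max_depth = 2, eta = 1, nthread = 2),
                 data = dtrain, nrounds = 2, obj = logregobj)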

Version 0.71.2, maintained by Tong He


https://github.com/dmlc/xgboost


Report a bug at https://github.com/dmlc/xgboost/issues


Browse source code at https://github.com/cran/xgboost


Authors: Tianqi Chen [aut], Tong He [aut, cre], Michael Benesty [aut], Vadim Khotilovich [aut], Yuan Tang [aut] (<https://orcid.org/0000-0001-5243-233X>), Hyunsu Cho [aut], Kailong Chen [aut], Rory Mitchell [aut], Ignacio Cano [aut], Tianyi Zhou [aut], Mu Li [aut], Junyuan Xie [aut], Min Lin [aut], Yifeng Geng [aut], Yutian Li [aut], XGBoost contributors [cph] (base XGBoost implementation)


Documentation: PDF Manual


Task views: Machine Learning & Statistical Learning, High-Performance and Parallel Computing with R, Model Deployment with R


License: Apache License (== 2.0) | file LICENSE


Imports Matrix, methods, data.table, magrittr, stringi

Suggests knitr, rmarkdown, ggplot2, DiagrammeR, Ckmeans.1d.dp, vcd, testthat, lintr, igraph

System requirements: GNU make, C++11


Imported by MlBayesOpt, SELF, SSL, autoBagging, blkbox, dblr, healthcareai, inTrees, rminer.

Suggested by Boruta, CBDA, DALEX, FeatureHashing, GSIF, SuperLearner, breakDown, coefplot, lime, mlr, pdp, pmml, rBayesianOptimization, rattle, utiml, vimp, vip.

