xgboost


Extreme Gradient Boosting.


This package is the R interface to XGBoost, an efficient implementation of the gradient boosting framework of Chen & Guestrin (2016). It includes an efficient linear model solver and tree learning algorithms, and it can automatically run parallel computation on a single machine, which can be more than ten times faster than existing gradient boosting packages. It supports various objective functions, including regression, classification, and ranking. The package is designed to be extensible, so users can easily define their own objectives.
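As a minimal sketch of the workflow the description outlines, the following trains a tree-boosted binary classifier on the mushroom data bundled with the package; the parameter values (`max_depth`, `eta`, `nrounds`, `nthread`) are illustrative only:

```r
# Minimal sketch, assuming the xgboost package is installed.
library(xgboost)

# Example data shipped with the package (sparse Matrix format)
data(agaricus.train, package = "xgboost")
data(agaricus.test, package = "xgboost")

# xgb.DMatrix is xgboost's optimized internal data container
dtrain <- xgb.DMatrix(agaricus.train$data, label = agaricus.train$label)

# Binary classification with tree boosting; nthread controls the
# single-machine parallelism mentioned above (values here are illustrative)
bst <- xgboost(data = dtrain, max_depth = 2, eta = 1,
               nthread = 2, nrounds = 10,
               objective = "binary:logistic")

# Predicted probabilities for the held-out set
pred <- predict(bst, agaricus.test$data)
```

Custom objectives can be supplied as an R function returning the gradient and hessian, which is what the description means by the package being extensible.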

Maintainer: Tong He
Author(s): Tianqi Chen [aut], Tong He [aut, cre], Michael Benesty [aut], Vadim Khotilovich [aut], Yuan Tang [aut] (<https://orcid.org/0000-0001-5243-233X>), Hyunsu Cho [aut], Kailong Chen [aut], Rory Mitchell [aut], Ignacio Cano [aut], Tianyi Zhou [aut], Mu Li [aut], Junyuan Xie [aut], Min Lin [aut], Yifeng Geng [aut], Yutian Li [aut], XGBoost contributors [cph] (base XGBoost implementation)

License: Apache License (== 2.0) | file LICENSE

Uses: data.table, magrittr, Matrix, stringi, ggplot2, igraph, vcd, testthat, Ckmeans.1d.dp, knitr, jsonlite, rmarkdown, lintr, DiagrammeR, float
Reverse suggests: bigsnpr, Boruta, breakDown, butcher, CBDA, ck37r, coefplot, DALEX, DALEXtra, fastshap, FeatureHashing, flashlight, forecastML, GSIF, iBreakDown, ingredients, lime, MachineShop, mlr, mlr3learners, modelplotr, modelStudio, nlpred, ParBayesianOptimization, parsnip, pdp, pmml, r2pmml, rattle, rBayesianOptimization, SSLR, SuperLearner, tidypredict, utiml, vimp, vip, xspliner

Released 15 days ago.

19 previous versions






Related packages: h2o, tensorflow, caret, GAMBoost, rgenoud, varSelRF, arules, randomForestSRC, Rborist, rattle, ssgraph, BDgraph, keras, reticulate, tfestimators, BatchJobs, BatchExperiments, RSNNS, doRNG, rminer (20 best matches, based on common tags).

