bst (0.3-17)


Gradient Boosting.

Implements a functional gradient descent algorithm for a variety of convex and non-convex loss functions, covering both classical and robust regression and classification problems. See Wang (2011), Wang (2012), Wang (2018), Wang (2018).
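To illustrate the idea behind functional gradient descent boosting (not the bst package's own code), here is a minimal sketch with squared-error loss and single-split regression stumps as base learners; at each round the stump is fit to the negative gradient (the residuals), and the fit is updated with a shrinkage step `nu`:

```python
# Minimal functional gradient descent boosting sketch (L2 loss).
# Illustrative only; the bst package supports many more losses and learners.

def fit_stump(x, r):
    """Best single-threshold stump on 1-D feature x for targets r."""
    best = None
    order = sorted(range(len(x)), key=lambda i: x[i])
    for k in range(1, len(x)):
        thr = (x[order[k - 1]] + x[order[k]]) / 2.0
        left = [r[i] for i in range(len(x)) if x[i] <= thr]
        right = [r[i] for i in range(len(x)) if x[i] > thr]
        if not left or not right:
            continue
        cl, cr = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((v - cl) ** 2 for v in left)
               + sum((v - cr) ** 2 for v in right))
        if best is None or sse < best[0]:
            best = (sse, thr, cl, cr)
    _, thr, cl, cr = best
    return lambda z: cl if z <= thr else cr

def boost(x, y, n_rounds=50, nu=0.1):
    """Gradient boosting: repeatedly fit stumps to residuals."""
    f = [0.0] * len(x)                      # current fit F_m(x_i)
    stumps = []
    for _ in range(n_rounds):
        # Negative gradient of the L2 loss is simply the residual.
        r = [yi - fi for yi, fi in zip(y, f)]
        h = fit_stump(x, r)
        stumps.append(h)
        f = [fi + nu * h(xi) for fi, xi in zip(f, x)]
    return lambda z: nu * sum(h(z) for h in stumps)
```

For other losses (hinge, robust losses, etc., as in bst) only the negative-gradient computation changes; the fitting loop stays the same.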

Maintainer: Zhu Wang
Author(s): Zhu Wang [aut, cre], Torsten Hothorn [ctb]

License: GPL (>= 2)

Uses: doParallel, foreach, gbm, rpart, R.rsp, gdata, pROC, knitr, hdi
Reverse suggests: caret, fscaret, mlr

Released 5 months ago.

11 previous versions


Related packages: BayesTree, ElemStatLearn, GAMBoost, LogicReg, ROCR, RXshrink, arules, caret, e1071, earth, effects, elasticnet, gbm, glmpath, grplasso, ipred, kernlab, klaR, lars, lasso2 (20 best matches, based on common tags).
