An implementation of extensions to Freund and Schapire's AdaBoost algorithm and Friedman's gradient boosting machine. Includes regression methods for least squares, absolute loss, t-distribution loss, quantile regression, logistic, multinomial logistic, Poisson, Cox proportional hazards partial likelihood, AdaBoost exponential loss, Huberized hinge loss, and Learning to Rank measures (LambdaMart). Originally developed by Greg Ridgeway.
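The package above is an R library, but the core idea it implements, Friedman's gradient boosting, can be illustrated in a few lines. The sketch below is a minimal, assumption-laden Python illustration of least-squares gradient boosting with depth-1 regression stumps; it is not the gbm package's API, and the function names (`fit_stump`, `gbm_fit`) and hyperparameters are invented for this example.

```python
# Minimal gradient-boosting sketch for least-squares loss (illustrative only;
# NOT the R gbm package's interface). Each boosting round fits a depth-1
# regression "stump" to the current residuals (the negative gradient of the
# squared-error loss) and adds it to the ensemble with a shrinkage factor.

def fit_stump(x, r):
    """Find the single split on x that best fits residuals r in squared error."""
    best = None
    for s in sorted(set(x)):
        left = [ri for xi, ri in zip(x, r) if xi <= s]
        right = [ri for xi, ri in zip(x, r) if xi > s]
        if not left or not right:
            continue
        lmean = sum(left) / len(left)
        rmean = sum(right) / len(right)
        sse = (sum((ri - lmean) ** 2 for ri in left)
               + sum((ri - rmean) ** 2 for ri in right))
        if best is None or sse < best[0]:
            best = (sse, s, lmean, rmean)
    _, s, lmean, rmean = best
    return lambda xi: lmean if xi <= s else rmean

def gbm_fit(x, y, n_rounds=50, shrinkage=0.1):
    """Boost stumps against L2 residuals; returns the fitted prediction function."""
    f0 = sum(y) / len(y)                    # initial constant model
    stumps = []
    pred = [f0] * len(y)
    for _ in range(n_rounds):
        resid = [yi - pi for yi, pi in zip(y, pred)]  # negative L2 gradient
        stump = fit_stump(x, resid)
        stumps.append(stump)
        pred = [pi + shrinkage * stump(xi) for pi, xi in zip(pred, x)]
    return lambda xi: f0 + shrinkage * sum(st(xi) for st in stumps)

# Tiny illustrative dataset (made up for this sketch): roughly y = x.
x = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
y = [0.1, 0.9, 2.1, 2.9, 4.2, 5.0]
model = gbm_fit(x, y)
```

The other loss functions the package lists (absolute loss, quantile, Poisson, Cox partial likelihood, and so on) fit the same loop: only the residual computation, i.e. the negative gradient of the chosen loss, changes.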
First published · 2020-07-15 10:44:53.912000+00:00
Last updated · 2026-01-24 14:04:25.049000+00:00