
Gradient Boosting Machine in III Acts: Trevor Hastie, Netflix & 0xdata

A Triple Header on Boosting & GBM:

Act I: Trevor Hastie of Stanford Mathematical Sciences, the statistician behind the Lasso and GBM, speaks on the nuances of the algorithm.

Act II: Cliff Click, CTO of 0xdata, who implemented the parallel and distributed GBM in H2O.

Act III: Antonio Molins, Data Scientist at Netflix, who uses GBM in his data-science practice for marketing algorithmic models.

Boosting is a simple strategy that produces dramatic improvements in prediction performance. It works by sequentially applying a classification algorithm to reweighted versions of the training data and taking a weighted majority vote of the resulting sequence of classifiers.
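To make the reweighting-and-voting loop concrete, here is a minimal AdaBoost-style sketch in Python. It is an illustration only, not any of the speakers' code; the use of scikit-learn decision stumps as the base classifier and labels in {-1, +1} are assumptions for the example.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def boost(X, y, n_rounds=50):
    """Fit a sequence of stumps to reweighted training data (AdaBoost style).
    Assumes y has labels in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)                      # start with uniform weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)         # fit to the reweighted data
        pred = stump.predict(X)
        err = np.clip(w[pred != y].sum(), 1e-10, 1 - 1e-10)  # weighted error
        alpha = 0.5 * np.log((1 - err) / err)    # this classifier's vote weight
        w *= np.exp(-alpha * y * pred)           # up-weight misclassified points
        w /= w.sum()                             # renormalize
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def vote(stumps, alphas, X):
    """Weighted majority vote of the sequence of classifiers."""
    scores = sum(a * s.predict(X) for s, a in zip(stumps, alphas))
    return np.sign(scores)
```

The key design choice is the weight update: points the current classifier gets wrong have their weights multiplied up, so each subsequent classifier concentrates on the hard cases.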

“In the last 10 years my colleagues and I have been drawn into the machine learning domain, probably after the lure of neural networks. This has led us to offer a statistical perspective on novel and popular techniques arising outside of statistics, such as boosting and support-vector machines. This culminated in our 2001 book ‘Elements of Statistical Learning’, but the interest continues.”
-Trevor Hastie, http://www.stanford.edu/~hastie

GBM Implementations (a minimal sketch of the core recurrence follows the links):

H2O: https://github.com/0xdata/h2o/tree/master/src/main/java/hex/gbm
R: http://cran.r-project.org/web/packages/gbm/gbm.pdf
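As a point of reference for what these implementations compute, here is a minimal gradient-boosting sketch for squared-error regression. It is a toy version of the recurrence, not the H2O or R gbm code; the scikit-learn regression trees and the parameter names are assumptions for illustration. Each tree is fit to the residuals of the current ensemble (the negative gradient of squared-error loss) and added with a shrinkage factor.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def gbm_fit(X, y, n_trees=100, learning_rate=0.1, max_depth=3):
    """Gradient boosting for squared error: each tree fits the residuals."""
    f0 = float(np.mean(y))                        # initial constant prediction
    pred = np.full(len(y), f0)
    trees = []
    for _ in range(n_trees):
        residuals = y - pred                      # negative gradient of 0.5*(y - f)^2
        tree = DecisionTreeRegressor(max_depth=max_depth)
        tree.fit(X, residuals)
        pred += learning_rate * tree.predict(X)   # shrunken additive update
        trees.append(tree)
    return f0, trees

def gbm_predict(f0, trees, X, learning_rate=0.1):
    """Sum the initial constant and the shrunken tree predictions."""
    pred = np.full(len(X), f0)
    for tree in trees:
        pred += learning_rate * tree.predict(X)
    return pred
```

Shrinkage (the learning_rate) trades more trees for better generalization, which is why GBM implementations typically expose both knobs.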

References:

Friedman, J., Hastie, T. & Tibshirani, R., “Additive Logistic Regression: A Statistical View of Boosting”: http://www.stanford.edu/~hastie/Papers/AdditiveLogisticRegression/alr.pdf