The Power of Combining Machine Learning Models – The Risks and Rewards of Random Forests, XGBoost, and Other Ensembles

Ensembling is one of the hottest techniques in today’s predictive analytics competitions. Every recent winner of Kaggle.com and KDD competitions has used an ensemble technique, often a well-known algorithm such as XGBoost or Random Forest.
Are these competition victories paving the way for widespread adoption of these techniques in organizations? This session will provide a detailed overview of ensemble models and their origins, and show why they are so effective. We will explain the building blocks of virtually all ensemble techniques, including bagging and boosting.
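
To give a flavor of the contrast the session draws, here is a minimal sketch, assuming scikit-learn is available: BaggingClassifier stands in for bagging (the idea behind Random Forest) and GradientBoostingClassifier for boosting (the idea XGBoost optimizes). The dataset is synthetic and all parameter choices are illustrative, not prescriptive.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier, GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Synthetic binary classification data (purely illustrative).
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Bagging: many decision trees trained independently on bootstrap
# resamples of the data, with their predictions averaged
# (Random Forest is a refinement of this idea).
bagged = BaggingClassifier(n_estimators=100, random_state=0)

# Boosting: shallow trees trained sequentially, each one focusing on
# the errors of the ensemble built so far (XGBoost is an optimized
# implementation of gradient boosting).
boosted = GradientBoostingClassifier(n_estimators=100, max_depth=3,
                                     random_state=0)

for name, model in [("bagging", bagged), ("boosting", boosted)]:
    model.fit(X_train, y_train)
    print(f"{name}: test accuracy = {model.score(X_test, y_test):.3f}")
```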

What You Will Learn:

  • What are ensemble models and what are their advantages?
  • Why are ensembles in the news?
  • The two most influential ensembling approaches: bagging and boosting
  • The core elements of ensembles and their application
  • The challenge of applying competition strategies to organizational problems