ML Modeling Tips
Advanced
Light Gradient Boosting
Light Gradient Boosting (LightGBM) is a lesser-known Python library and a powerful machine learning model developed by Microsoft. One of its key advantages is its exceptional speed, making it an efficient choice for training models. Additionally, LightGBM gives us the flexibility to select boosting types and control the maximum number of leaves in each tree. These features, which we will explore during our class, enable us to fine-tune our models for various tasks and datasets, further expanding our machine learning toolkit.
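Below is a minimal sketch of training a LightGBM classifier on synthetic data; the dataset, split, and hyperparameter values are illustrative assumptions, not settings from the class.

```python
from lightgbm import LGBMClassifier
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic data stands in for a real dataset here.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = LGBMClassifier(
    boosting_type="gbdt",  # the boosting type; "dart" is another option
    num_leaves=31,         # caps the number of leaves in each tree
    n_estimators=200,
    learning_rate=0.1,
)
model.fit(X_train, y_train)
print("Test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```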
Advanced
Best Ensemble Method Sklearn
In the video we discuss various machine learning models within the Scikit-Learn library and compare them in an ensemble-method framework. Ensemble methods, such as bagging or boosting, combine the predictions of multiple models to enhance overall performance. By comparing these approaches, one can determine which model exhibits superior predictive capabilities. The goal of this analysis is to provide insight into which machine learning algorithm is best suited to a specific problem, a critical decision in data science and machine learning. The selection of an appropriate model can significantly impact the success of a project, making such comparative analyses invaluable to data scientists.
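As a sketch of the kind of comparison discussed here, the snippet below scores several Scikit-Learn ensembles with cross-validation; the synthetic dataset and mostly-default settings are assumptions for illustration.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import (
    AdaBoostClassifier,
    BaggingClassifier,
    GradientBoostingClassifier,
    RandomForestClassifier,
)
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)

models = {
    "Bagging": BaggingClassifier(random_state=42),
    "Random Forest": RandomForestClassifier(random_state=42),
    "AdaBoost": AdaBoostClassifier(random_state=42),
    "Gradient Boosting": GradientBoostingClassifier(random_state=42),
}

# Score each ensemble with 5-fold cross-validation and report mean accuracy.
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: {scores.mean():.3f} (+/- {scores.std():.3f})")
```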
Advanced
XGBoost - Extreme Gradient Boosting
In this machine learning session, we embark on a journey to unravel the intricacies of XGBoost, short for Extreme Gradient Boosting, a renowned model from a library dedicated entirely to gradient boosting. This formidable model offers an expansive array of hyperparameters, far surpassing those found in scikit-learn's Gradient Boosting. As we delve into the mathematical underpinnings that unite these models, our primary objective in this Python-based machine learning lesson is to compare and contrast XGBoost with GradientBoosting, ultimately determining the superior machine learning algorithm for your predictive endeavors.
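A minimal sketch of such a head-to-head comparison might look like the following; the dataset, split, and matched hyperparameters are illustrative assumptions rather than the lesson's exact setup.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Matching n_estimators and learning_rate keeps the comparison fair.
# XGBoost exposes extra regularization knobs that GradientBoosting lacks,
# such as reg_lambda (L2) and reg_alpha (L1) penalties on leaf weights.
sk_model = GradientBoostingClassifier(
    n_estimators=200, learning_rate=0.1, random_state=42
)
xgb_model = XGBClassifier(n_estimators=200, learning_rate=0.1, random_state=42)

for name, model in [("GradientBoosting", sk_model), ("XGBoost", xgb_model)]:
    model.fit(X_train, y_train)
    print(name, accuracy_score(y_test, model.predict(X_test)))
```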
Advanced
Extremely Randomized Trees
The ExtraTreesRegressor, or Extremely Randomized Trees, distinguishes itself by introducing an additional layer of randomness during the construction of decision trees in an ensemble. Unlike Random Forest, which searches for the optimal threshold for each candidate feature, Extra Trees draws split thresholds at random and keeps the best of these random splits. This higher degree of randomization often results in a more diverse set of trees, which can lead to lower variance and faster training times compared to traditional Random Forests.
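The sketch below contrasts ExtraTreesRegressor with RandomForestRegressor on synthetic regression data; the data and settings are illustrative assumptions.

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import ExtraTreesRegressor, RandomForestRegressor
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=1000, n_features=20, noise=10.0, random_state=42)

# Extra Trees draws split thresholds at random instead of searching for the
# best one, which typically makes it faster to train than Random Forest.
for name, model in [
    ("Random Forest", RandomForestRegressor(n_estimators=100, random_state=42)),
    ("Extra Trees", ExtraTreesRegressor(n_estimators=100, random_state=42)),
]:
    scores = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"{name}: mean R^2 = {scores.mean():.3f}")
```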
Advanced
Gradient Boosting with Sklearn in Python
In this free machine learning tips video we explore how to get the most out of the Gradient Boosting ensemble method with Sklearn in Python. Gradient Boosting is an ensemble method that uses gradient descent to optimize its predictions. Because it learns sequentially, it takes longer to train than Random Forest or Bagging, but it can often outperform them. It is essentially Sklearn's version of XGBoost and typically works just as well.
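A minimal sketch of Sklearn's GradientBoostingRegressor on synthetic data is shown below; the hyperparameter choices are illustrative assumptions.

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=1000, n_features=20, noise=10.0, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Each new tree is fit to the gradient of the loss from the previous round,
# so the trees must be built sequentially (unlike Random Forest or Bagging).
model = GradientBoostingRegressor(
    n_estimators=300,
    learning_rate=0.05,  # smaller steps down the gradient per tree
    max_depth=3,
    random_state=42,
)
model.fit(X_train, y_train)
print("Test MSE:", mean_squared_error(y_test, model.predict(X_test)))
```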