Tag: XGBoost

AI Frontiers Series: Supply Chain

Recently, I’ve pondered how I can provide equal value to both technical and business-oriented professionals in my writing. Fortunately, my role as a data science consultant naturally offers a wealth of interesting topics. Beyond coding, we consistently review literature and articles detailing...

10 Confusing XGBoost Hyperparameters and How to Tune Them Like a Pro in 2023

Today, I am going to show you how to squeeze XGBoost so hard that both ‘o’s pop out. We will achieve this by fine-tuning its hyperparameters to such an extent that it will no longer be able to bst after giving us all the performance it can. This will not be a mere hyperparam...

XGBoost: The Definitive Guide (Part 1)

XGBoost (short for eXtreme Gradient Boosting) is an open-source library that provides an optimized and scalable implementation of gradient boosted decision trees. It incorporates various software and hardware optimization techniques that allow it to deal with huge amounts of data. Originally deve...

Time Series Forecasting with XGBoost and LightGBM: Predicting Energy Consumption with Lag Features

In a previous article, we went through the process of creating a model capable of predicting energy consumption for the city of London. Essentially, it was a time series forecasting problem where we had utilized the London Energy Dataset and the London Weat...
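The lag features mentioned in the title are simply past values of the series shifted forward so they can serve as predictors for the current timestep. A minimal sketch with pandas, using a made-up hourly consumption series (the article itself uses the London Energy Dataset):

```python
import pandas as pd

# Hypothetical hourly consumption values, stand-ins for the real dataset
idx = pd.date_range("2023-01-01", periods=6, freq="h")
df = pd.DataFrame({"consumption": [1.0, 1.2, 0.9, 1.1, 1.3, 1.0]}, index=idx)

# Lag features: the values 1 and 2 steps in the past become predictors for "now"
df["lag_1"] = df["consumption"].shift(1)
df["lag_2"] = df["consumption"].shift(2)

# The earliest rows have no history, so they are dropped before training
df = df.dropna()
```

The resulting frame can be fed directly to XGBoost or LightGBM with `consumption` as the target and the lag columns as features.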

XGBoost Algorithm in Machine Learning

XGBoost is a versatile machine learning algorithm that finds applications in a wide range of domains. Some of its common uses and applications: Classification: XGBoost is often employed for classification tasks, such as spam detection, image recognition, fraud detection, and sentiment analysis...

Optimizing XGBoost: A Guide to Hyperparameter Tuning

Hyperparameter tuning is important because the performance of a machine learning model is heavily influenced by the choice of hyperparameters. Choosing the right set of hyperparameters can lead to better model performance, while choosing the wrong set can lead to poor performance. Additionally, when...

The main parameters in XGBoost and their effects on model performance

The learning rate (eta) controls how large a step each new tree contributes to the ensemble. A smaller eta value results in slower learning that typically generalizes better, while a larger eta value results in faster learning that risks overshooting. It is common to start with a relatively high value and then gradually d...