GRADIENT BOOSTING ALGORITHM: MATHEMATICAL FOUNDATIONS AND ADVANTAGES OF CATBOOST IN MACHINE LEARNING
-
DOI: https://doi.org/10.22533/at.ed.316122508043
-
Keywords: Gradient Boosting, CatBoost, Categorical Variables
-
Abstract: This article provides an overview of the Gradient Boosting algorithm, with a particular focus on CatBoost, one of its most prominent variants. It explains the fundamental principle of Gradient Boosting, an iterative method that combines weak models (such as decision trees) to minimize a loss function by performing gradient descent in function space, fitting each new model to the negative gradient of the loss. Additionally, the mathematical foundation behind this algorithm and its implementation in CatBoost are analyzed, highlighting CatBoost's ability to handle categorical variables efficiently without requiring complex transformations. Finally, the key advantages of CatBoost are discussed, including its training speed, its handling of large datasets, and its ability to reduce overfitting, making it a powerful tool for diverse machine learning applications.
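As a compact reference for the principle summarized above, the standard Gradient Boosting update can be sketched as follows (a textbook formulation given for orientation; the loss $L$, learning rate $\nu$, and indexing are generic notation, not taken from the article body):

$$
r_{im} = -\left[\frac{\partial L\bigl(y_i, F(x_i)\bigr)}{\partial F(x_i)}\right]_{F = F_{m-1}}, \qquad
F_m(x) = F_{m-1}(x) + \nu\, h_m(x),
$$

where at iteration $m$ the weak learner $h_m$ (e.g., a decision tree) is fitted to the pseudo-residuals $r_{im}$, so that each new model takes a step along the negative gradient of the loss in function space.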
- Carlos Alberto Peña Miranda
- Jesus Adalberto Zelaya Contreras
- Elizabeth Cosi Cruz