
Cross Validation with LGBM: Only 1 Use Case?

When employing cross-validation with the LightGBM (LGBM) regressor, it primarily serves one key function: determining the optimal number of boosting rounds, often stored as `best_num_round`. Users have noted that beyond this, the utility of cross-validation in LGBM seems limited: training models with it has yielded no additional benefit in their experience. Notably, an alternative approach, early stopping on a held-out validation set, can arrive at a similar round count without running full cross-validation. This raises questions about the broader applicability of cross-validation in LGBM settings.

Source: stackoverflow.com