3. #Gradientboosting is VERY interpretable with SHAP (Shapley-value attributions). They are misleading when they claim their deep neural network is more interpretable and that boosting is not. They are apparently ignorant of these important advances in interpretability, even though SHAP is more than 5 years old now.

4. Despite a lot of talk about class imbalance, the churn datasets are not very imbalanced - 10%-20% churn rates. Really imbalanced data means low single-digit churn rates.

Master #GradientBoosting for #StockPrediction! Learn to build a powerful #MachineLearning model using #Python and $TSLA data. Enhance your trading strategy with data-driven insights. #FinTech #DataScience #TradingAlgorithms

https://teguhteja.id/gradient-boosting-stock-prediction-python-guide/

Gradient Boosting: Mastering Stock Price Prediction with Python

Gradient boosting stock prediction: Learn to implement a powerful machine learning model for financial analysis using Python and Tesla data

teguhteja.id
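The tutorial's exact code isn't shown here, so this is only a hedged sketch of the idea: boosted trees on lagged returns to predict next-day direction, with synthetic prices standing in for the $TSLA download and illustrative feature/label choices:

```python
# Sketch: gradient boosting for next-day direction prediction.
# Synthetic random-walk prices stand in for real $TSLA data; the lag
# features and 80/20 split are assumptions, not the tutorial's choices.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
price = pd.Series(100 * np.exp(np.cumsum(rng.normal(0, 0.02, 500))))
ret = price.pct_change()

# Lagged returns as features; next-day direction as the label.
X = pd.concat([ret.shift(i) for i in range(1, 6)], axis=1).dropna()
y = (ret.shift(-1).loc[X.index] > 0).astype(int)
X, y = X.iloc[:-1], y.iloc[:-1]  # last row has no next-day label

# Chronological split -- never shuffle a time series.
split = int(len(X) * 0.8)
model = GradientBoostingClassifier(random_state=0)
model.fit(X.iloc[:split], y.iloc[:split])
acc = model.score(X.iloc[split:], y.iloc[split:])
print(round(acc, 2))
```

On a true random walk the accuracy hovers near chance, which is itself a useful baseline check before trusting any stock-prediction model.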
Predicting and improving complex beer flavor through machine learning - Nature Communications

Perception and appreciation of food flavour depends on many factors, posing a challenge for effective prediction. Here, the authors combine extensive chemical and sensory analyses of 250 commercial Belgian beers to train machine learning models that enable flavour and consumer appreciation prediction.

Nature

I ran a quick Gradient Boosted Trees vs Neural Nets check using scikit-learn's dev branch, which makes it more convenient to work with tabular datasets mixing numerical and categorical features (e.g. the Adult Census dataset).

Let's start with the GBRT model. It's now possible to reproduce the SOTA number on this dataset in a few lines of code and about 2 s (CV included) on my laptop.

1/n

#sklearn #PyData #MachineLearning #TabularData #GradientBoosting #DeepLearning #Python

I'm excited to see #gradientboosting making some news! There is so much #aihype around #llms (and before that it was #deeplearning) but I think that for most #datascientists working in industry the development of #gradientboosting #machinelearning algorithms (like #xgboost and #catboost) are the real revolution and will have a much more long lived impact on our work.

https://www.nature.com/articles/s41598-022-20149-z

Gradient boosting decision tree becomes more reliable than logistic regression in predicting probability for diabetes with big data - Scientific Reports

We sought to verify the reliability of machine learning (ML) in developing diabetes prediction models by utilizing big data. To this end, we compared the reliability of gradient boosting decision tree (GBDT) and logistic regression (LR) models using data obtained from the Kokuho-database of the Osaka prefecture, Japan. To develop the models, we focused on 16 predictors from health checkup data from April 2013 to December 2014. A total of 277,651 eligible participants were studied. The prediction models were developed using a light gradient boosting machine (LightGBM), which is an effective GBDT implementation algorithm, and LR. Their reliabilities were measured based on expected calibration error (ECE), negative log-likelihood (Logloss), and reliability diagrams. Similarly, their classification accuracies were measured in the area under the curve (AUC). We further analyzed their reliabilities while changing the sample size for training. Among the 277,651 participants, 15,900 (7978 males and 7922 females) were newly diagnosed with diabetes within 3 years. LightGBM (LR) achieved an ECE of 0.0018 ± 0.00033 (0.0048 ± 0.00058), a Logloss of 0.167 ± 0.00062 (0.172 ± 0.00090), and an AUC of 0.844 ± 0.0025 (0.826 ± 0.0035). From sample size analysis, the reliability of LightGBM became higher than LR when the sample size increased beyond 10^4. Thus, we confirmed that GBDT provides a more reliable model than LR in the development of diabetes prediction models using big data. ML could potentially produce a highly reliable diabetes prediction model, a helpful tool for improving lifestyle and preventing diabetes.

Nature
Using the #ensemble of #gradientboosting regression models for FedCSIS 2022 Challenge: “Diversified gradient boosting ensembles for prediction of the cost of forwarding contracts” by D. Ruta, M. Liu, L. Cen, QH Vu. @FedCSIS
2022, ACSIS Vol. 30 p. 431–436; http://tinyurl.com/4vszh6wt
Annals of Computer Science and Information Systems, Volume 30

#MachineLearning lesson of the day: Working with a #gradientboosting model, I got no traction cross-validating hyperparameters like tree depth and number; but different evaluation metrics (e.g. RMSE vs. MAE, etc.) had a major impact. Have you tried this?

IMHO #DNN get all the press because they do sexy human jobs like seeing and processing language. But in the business world of tabular data, #gradientboosting is where the real revolution is happening! #xgboost #catboost #lightgbm #DataScience