Title: P3: AutoML ⤜(⚆i⚆)⤏ [2023-12-06 Wed]
- LIGHTAUTOML BO of linear models and GBM Vakhrushev et al. (2021)
- MLJAR Custom data science pipeline Płońska and Płoński (2021)
- NAIVEAUTOML Custom data science pipeline Mohr and Wever (2023)
- TPOT EO of SCIKIT-LEARN pipelines Olson and Moore (2016)
GPU-based:
- AUTO-KERAS (Jin et al., 2019)
- AUTOPYTORCH (Zimmer et al., 2021)
🎀
#automl #ml #nn
Title: P2: AutoML ⤜(⚆i⚆)⤏ [2023-12-06 Wed]
- AUTOGLUON Stacked ensembles of preset pipelines Erickson et al. (2020)
- AUTO-SKLEARN BO of SCIKIT-LEARN pipelines Feurer et al. (2015a)
- AUTO-SKLEARN 2 BO of iterative algorithms Feurer et al. (2020)
- FLAML CFO of iterative algorithms Wang et al. (2021)
- GAMA EO of SCIKIT-LEARN pipelines Gijsbers and Vanschoren (2021)
- H2O AUTOML Iterative mix of RS and ensembling LeDell and Poirier (2020)
#automl #ml #nn

Title: P1: AutoML ⤜(⚆i⚆)⤏ [2023-12-06 Wed]
- Meta-Learning - 1) collect meta-data about prior learning tasks and previously learned models; 2) learn from this meta-data to extract and transfer knowledge that guides the search for optimal models on new tasks
- meta-features - measurable properties of the task itself
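A minimal sketch of what "meta-features" can look like in practice. The specific features computed here (sample count, feature count, class count, class entropy) are common simple choices, but the function and the toy dataset are illustrative, not taken from any particular AutoML framework:

```python
import math
from collections import Counter

def meta_features(X, y):
    """Compute a few simple meta-features of a classification dataset:
    number of samples, number of features, number of classes, and the
    entropy of the class distribution (in bits)."""
    counts = Counter(y)
    n = len(y)
    entropy = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return {
        "n_samples": n,
        "n_features": len(X[0]),
        "n_classes": len(counts),
        "class_entropy": entropy,
    }

# Toy dataset: 4 samples, 2 features, perfectly balanced binary labels.
X = [[1.0, 2.0], [2.0, 1.0], [3.0, 0.5], [0.5, 3.0]]
y = [0, 0, 1, 1]
print(meta_features(X, y))  # class_entropy is 1.0 for a balanced binary task
```

A meta-learner can then compare such vectors across tasks to warm-start the search on a new, similar task.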

optimization techniques:
- Bayesian optimization (BO)
- evolutionary optimization (EO)
- random search (RS)
- cost-frugal optimization (CFO)
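Of the techniques above, random search (RS) is the easiest to sketch: sample configurations uniformly from the search space and keep the best one. The search space and the stand-in objective below are invented for illustration:

```python
import random

def random_search(objective, space, n_iter=100, seed=0):
    """Random search: draw hyperparameter configurations uniformly at
    random from `space` and keep the one with the lowest objective."""
    rng = random.Random(seed)
    best_cfg, best_val = None, float("inf")
    for _ in range(n_iter):
        cfg = {name: rng.uniform(lo, hi) for name, (lo, hi) in space.items()}
        val = objective(cfg)
        if val < best_val:
            best_cfg, best_val = cfg, val
    return best_cfg, best_val

# Stand-in objective: a fake validation loss, minimized at lr=0.1, reg=1.0.
def fake_val_loss(cfg):
    return (cfg["lr"] - 0.1) ** 2 + (cfg["reg"] - 1.0) ** 2

space = {"lr": (0.0, 1.0), "reg": (0.0, 2.0)}
cfg, loss = random_search(fake_val_loss, space, n_iter=500)
print(cfg, loss)
```

BO, EO, and CFO differ mainly in how the next configuration is proposed (surrogate model, mutation/crossover, or cost-aware local moves) rather than in this outer loop.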

open-source frameworks
#automl #ml #nn

Title: P0: AutoML ⤜(⚆i⚆)⤏ [2023-12-06 Wed]
Major papers:
- Automated Machine Learning - Methods, Systems, Challenges. Springer, 2019 https://www.automl.org/wp-content/uploads/2019/05/AutoML_Book.pdf
- sequential model-based optimization (Hutter et al., 2011; Snoek et al., 2012)
- hierarchical task planning (Erol et al., 1994)
- genetic programming (Koza, 1992)

tasks that AutoML solves:
- Neural Architecture Search (NAS)
- Hyperparameter Optimization
#automl #ml #nn

2am? At my Age?? I should be in bed!

#NN

Title: P0: I have been reading 📚 about Ensemble Learning [2023-08-07 Mon]
in Machine Learning and Deep Learning:
- book: 2012 Ensemble Methods: Foundations and Algorithms - Zhi-Hua Zhou
- article: 2022 [2104.02395] Ensemble deep learning: A review https://arxiv.org/abs/2104.02395
It is possible to compute an upper bound on the error of the ensemble's final output;
for AdaBoost combining binary weak classifiers, the error
decreases exponentially with the ensemble size.
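The exponential decrease mentioned above is the classical AdaBoost training-error bound (due to Freund and Schapire; strictly, it bounds the training error, while generalization statements require additional margin-based arguments). With $\varepsilon_t$ the weighted error of the $t$-th weak learner and edge $\gamma_t = \tfrac{1}{2} - \varepsilon_t$:

```latex
\[
\frac{1}{m}\sum_{i=1}^{m} \mathbf{1}\!\left[H(x_i) \neq y_i\right]
\;\le\; \prod_{t=1}^{T} 2\sqrt{\varepsilon_t(1-\varepsilon_t)}
\;=\; \prod_{t=1}^{T} \sqrt{1-4\gamma_t^2}
\;\le\; \exp\!\Big(-2\sum_{t=1}^{T}\gamma_t^2\Big)
\]
```

So if every weak learner beats random guessing by some fixed edge $\gamma_t \ge \gamma > 0$, the training error drops exponentially in the number $T$ of ensemble members.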
❤️
#ensemble #NN