I love [[AutoGluon]] for fast iteration. But let it run wild and it’ll stack an ensemble of 20 models and chew your entire node.
I now explicitly cap it to three predictors: [[LightGBM]], [[CatBoost]], and a random forest. That’s it. No deep nets, no kNN, no multi-hour ensembles.
It still beats hand-tuned baselines. And I get results in 15 minutes instead of two hours.
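A minimal sketch of that cap, assuming AutoGluon’s tabular API: `"GBM"`, `"CAT"`, and `"RF"` are AutoGluon’s model keys for LightGBM, CatBoost, and random forests, and `time_limit`, `num_stack_levels`, and `num_bag_folds` are `TabularPredictor.fit` parameters. The actual `fit()` call is left as a comment so the snippet stands on its own without AutoGluon installed; `train_data` and the `"target"` label are placeholders.

```python
# Restrict AutoGluon to a few fast tree models and kill the big ensembles.

# Only these three model families get trained -- no neural nets, no kNN.
hyperparameters = {
    "GBM": {},  # LightGBM with AutoGluon defaults
    "CAT": {},  # CatBoost
    "RF": {},   # random forest
}

fit_kwargs = {
    "hyperparameters": hyperparameters,
    "time_limit": 15 * 60,   # hard wall-clock cap: 15 minutes
    "num_stack_levels": 0,   # no multi-layer stacking
    "num_bag_folds": 0,      # no bagged ensembles
}

# With AutoGluon installed, this becomes:
# from autogluon.tabular import TabularPredictor
# predictor = TabularPredictor(label="target").fit(train_data, **fit_kwargs)
```

The key lever is `hyperparameters`: whatever model keys you omit simply never get trained, so runtime stays bounded by the three cheap predictors plus the `time_limit` cap.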
[[AutoGluon]] [[ML]] [[LightGBM]] [[CatBoost]] [[RandomForest]] [[Serendipity]]