Update the hyperparameter tuning methods #353
Replies: 3 comments 2 replies
-
Good idea, I think tuning beyond grid search is an important issue for practitioners! FLAML AutoML would also be a promising option; both would be viable to integrate.
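For context, a minimal standalone sketch of what tuning with FLAML AutoML could look like (this is not an existing DoubleML integration; the time budget and estimator list are illustrative choices):

```python
# Sketch only: FLAML AutoML tuning a regression learner on toy data.
# The time budget and estimator list are illustrative, not a recommendation.
from flaml import AutoML
from sklearn.datasets import make_regression

X, y = make_regression(n_samples=500, n_features=20, noise=1.0, random_state=0)

automl = AutoML()
automl.fit(X_train=X, y_train=y, task="regression",
           time_budget=30,                 # seconds of search
           estimator_list=["lgbm", "rf"])  # restrict the model search space
print(automl.best_estimator, automl.best_config)
```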
-
Hi all, I made an initial implementation on the following branch to try out Optuna: https://github.com/DoubleML/doubleml-for-py/tree/j-optuna
I know the implementation isn't great at the moment; I see this first version more as a proof of concept (created with some help from AI). You can also find a quick simulation in the following notebook: optuna_tuning_comparison.ipynb. At first glance, it definitely seems to be faster than sklearn's tuning. Other advantages may arise in the case of more complex DGPs?
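For illustration only, and not the code on the branch: a minimal sketch of tuning a single nuisance-style learner with Optuna via cross-validated MSE, where the learner choice and the search space are assumptions:

```python
# Sketch only (not the branch's implementation): tune one learner with Optuna
# by maximizing cross-validated negative MSE.
import optuna
from lightgbm import LGBMRegressor
from sklearn.datasets import make_regression
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=500, n_features=20, noise=1.0, random_state=0)

def objective(trial):
    # Illustrative search space for a LightGBM regressor.
    params = {
        'n_estimators': trial.suggest_int('n_estimators', 50, 500),
        'learning_rate': trial.suggest_float('learning_rate', 0.01, 0.3, log=True),
        'num_leaves': trial.suggest_int('num_leaves', 8, 64),
    }
    learner = LGBMRegressor(**params, verbose=-1)
    # Negative MSE so that Optuna maximizes predictive accuracy.
    return cross_val_score(learner, X, y, cv=5,
                           scoring='neg_mean_squared_error').mean()

study = optuna.create_study(direction='maximize')
study.optimize(objective, n_trials=50)
print(study.best_params)
```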
-
One specific issue with tuning is allowing for tuning on folds, i.e. fold-specific nuisance parameters. I would suggest removing the possibility to tune on folds completely and setting parameters just per learner (optionally even allowing for different parameters per learner, e.g. `g1` vs. `g0`).
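As a concrete illustration, a minimal sketch of setting parameters once per learner via the existing `set_ml_nuisance_params` method, assuming the IRM learner names (`ml_g0`, `ml_g1`, `ml_m`); the parameter values are purely illustrative:

```python
# Sketch only: externally tuned parameters set once per nuisance learner,
# with no fold-specific variation. Values are illustrative.
import numpy as np
from lightgbm import LGBMClassifier, LGBMRegressor

from doubleml import DoubleMLIRM
from doubleml.datasets import make_irm_data

np.random.seed(42)
dml_data = make_irm_data(n_obs=500)

dml_irm = DoubleMLIRM(dml_data,
                      ml_g=LGBMRegressor(verbose=-1),
                      ml_m=LGBMClassifier(verbose=-1))

# Parameters per learner; 'ml_g0' and 'ml_g1' may differ from each other,
# but each set applies to all folds.
dml_irm.set_ml_nuisance_params('ml_g0', 'd', {'n_estimators': 100, 'learning_rate': 0.05})
dml_irm.set_ml_nuisance_params('ml_g1', 'd', {'n_estimators': 300, 'learning_rate': 0.05})
dml_irm.set_ml_nuisance_params('ml_m', 'd', {'n_estimators': 200})

dml_irm.fit()
print(dml_irm.summary)
```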
-
Reworking the tuning procedures in DoubleML to go beyond basic grid search would be a good addition.
I think the current framework could be adapted quite well to work with Optuna.
A `tune_optuna` method could accept parameter dictionaries for the specific learners and set up an Optuna study, see example. Any comments or suggestions on hyperparameter tuning?
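A rough usage sketch of how such a `tune_optuna` call could look; the method name, the per-learner search-space format, and the keyword arguments are assumptions for discussion, not existing API:

```python
# Hypothetical usage sketch: tune_optuna does not exist in DoubleML yet; the
# method name, the search-space format (Optuna distributions per learner) and
# the keyword arguments are assumptions for discussion.
import numpy as np
import optuna
from lightgbm import LGBMRegressor

from doubleml import DoubleMLPLR
from doubleml.datasets import make_plr_CCDDHNR2018

np.random.seed(42)
dml_data = make_plr_CCDDHNR2018(n_obs=500)

dml_plr = DoubleMLPLR(dml_data,
                      ml_l=LGBMRegressor(verbose=-1),
                      ml_m=LGBMRegressor(verbose=-1))

# One parameter dictionary per nuisance learner.
search_spaces = {
    'ml_l': {
        'n_estimators': optuna.distributions.IntDistribution(50, 500),
        'learning_rate': optuna.distributions.FloatDistribution(0.01, 0.3, log=True),
    },
    'ml_m': {
        'n_estimators': optuna.distributions.IntDistribution(50, 500),
        'learning_rate': optuna.distributions.FloatDistribution(0.01, 0.3, log=True),
    },
}

# Hypothetical method: set up one Optuna study per learner and write the best
# parameters back into the model object before fitting.
dml_plr.tune_optuna(search_spaces, n_trials=50)
dml_plr.fit()
print(dml_plr.summary)
```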