
Limitation in number of Hyperparameters with Multi-Fidelity #897

Closed
Mshz2 opened this issue Jan 5, 2023 · 3 comments

Mshz2 commented Jan 5, 2023

Label: question

Description

Hi, I would like to use the multi-fidelity option of SMAC to find an optimal arrangement defined by about 2752 hyperparameters within a ConfigurationSpace.

I was wondering whether there is any performance limitation on the number of hyperparameters when optimizing with your implementation?
These 2752 hyperparameters can be all categorical or all integers.

Thanks for your support and best regards

mlindauer (Contributor) commented

Hi,

In principle, you can run SMAC on a space with 2752 hyperparameters. You only need to use the RF as the model (which is the default for the multi-fidelity facade), not the GP.
I think the largest space we have run was also around 2k hyperparameters.

However, please note that if the effective dimensionality is of the same order (i.e., nearly all hyperparameters are important) and the hyperparameters interact with each other, I doubt that SMAC will deliver a satisfying result. Nevertheless, it is often the case that only a few hyperparameters matter, and SMAC does a reasonable job of focusing on these.

Best,
Marius
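The multi-fidelity mechanism that SMAC's multi-fidelity facade builds on can be illustrated with a toy successive-halving loop. This is a simplified sketch in plain Python, not SMAC's actual implementation; the objective function, the integer "configurations", and the budget schedule here are all invented for illustration:

```python
import random

def successive_halving(configs, evaluate, min_budget=1, max_budget=27, eta=3):
    """Toy successive halving: evaluate all configs at a small budget,
    keep the best 1/eta fraction, and re-evaluate the survivors at eta
    times the budget, until max_budget is reached or one config remains."""
    budget = min_budget
    survivors = list(configs)
    while budget <= max_budget and len(survivors) > 1:
        scores = {c: evaluate(c, budget) for c in survivors}
        survivors.sort(key=lambda c: scores[c])  # lower loss is better
        survivors = survivors[:max(1, len(survivors) // eta)]
        budget *= eta
    return survivors[0]

# Hypothetical objective: a config is just an int, the true optimum is 42,
# and evaluation noise shrinks as the budget grows (mimicking how larger
# training budgets give more reliable estimates).
rng = random.Random(0)

def evaluate(config, budget):
    return abs(config - 42) + rng.gauss(0, 1.0 / budget)

configs = rng.sample(range(100), 27)
best = successive_halving(configs, evaluate)
```

Most configurations are discarded after only cheap low-budget evaluations, and only a handful ever reach the full budget; that screening effect is what makes multi-fidelity attractive for large configuration spaces.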

Mshz2 (Author) commented Jan 8, 2023


Thank you so much, and I really appreciate your reply. If that is the case, would you suggest a random search strategy instead of surrogate-model-based optimization or any Bayesian optimization method?

mlindauer (Contributor) commented

No, random search would likely perform even worse than Bayesian optimization, so that won't help.
You could try local Bayesian optimization methods such as TuRBO, but they may not include multi-fidelity -- I haven't checked.

@Mshz2 Mshz2 closed this as completed Jan 9, 2023