No more quantized hyperparameters? #390

Open
bpkroth opened this issue Aug 5, 2024 · 5 comments

Comments

@bpkroth
Contributor

bpkroth commented Aug 5, 2024

#346 removed quantization

No more quantized hyperparameters

Can I ask what the reasoning behind this was, and can it be restored? That was a very useful feature, especially for systems tuning applications.

@eddiebergman
Contributor

Hi @bpkroth,

The reason behind it was simply that we didn't know of anyone using these features, and that they had surprising and inconsistent behaviours. They complicated a lot of internal code, and given the limited manpower we can put towards maintaining things, this was a feature we removed. The plan was to eventually re-introduce them if we had requests for it, which it seems you do.

This should be a lot easier now since the notion of the distribution is separated from the concern of transformation from the vectorized space, e.g. (0, 1), [0, ..., N], to the actual value space, i.e. (-10, 243) and [cat, dog, ..., mouse].

Doing so would essentially involve having an integer distribution with a Transformer that takes values from this integer vectorized space to the value space.
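
Roughly, a sketch of the idea (plain Python, not the actual ConfigSpace classes; the names here are made up purely for illustration):

```python
# Hypothetical sketch: a quantized float parameter backed by a uniform integer
# distribution over the vectorized space [0, ..., n_steps], plus a transformer
# that maps each integer step back into the real value space [lower, upper].
from dataclasses import dataclass


@dataclass
class QuantizedFloatTransformer:
    lower: float  # lower bound in the value space, e.g. -10
    upper: float  # upper bound in the value space, e.g. 243
    q: float      # quantization step, e.g. 0.5

    @property
    def n_steps(self) -> int:
        # Size of the integer vectorized space [0, ..., n_steps].
        return int(round((self.upper - self.lower) / self.q))

    def to_value(self, step: int) -> float:
        # Vectorized (integer) space -> value space.
        return self.lower + step * self.q

    def to_vector(self, value: float) -> int:
        # Value space -> nearest integer step in the vectorized space.
        return int(round((value - self.lower) / self.q))


t = QuantizedFloatTransformer(lower=0.0, upper=10.0, q=0.5)
assert t.to_value(t.to_vector(7.3)) == 7.5  # snaps to the nearest multiple of q
```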

For non-uniform integer distributions, the following could be used, while for uniform distributions, this distribution could be used.

When they come back, they would most likely come back as separate hyperparameters, for example a QUniformFloat or a QUniformInt. I believe trying to handle everything in one base class leads to a lot of edge cases and messiness.

I can't give you a concrete timeline on this, but I'm happy to review any PRs for it. If you only need a workaround and you are wrapping ConfigSpace, then applying a custom transformation to an existing integer hyperparameter is your best bet.
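
A minimal sketch of that workaround, assuming the current ConfigurationSpace / UniformIntegerHyperparameter API (the quantize() helper is not part of ConfigSpace; it would live in your wrapper):

```python
# Workaround sketch: tune an integer "step" parameter and map it to the
# quantized value in your own wrapper, just before handing it to the target.
from ConfigSpace import ConfigurationSpace, UniformIntegerHyperparameter

LOWER, Q = 0.0, 0.5  # value space starts at 0.0 with a step size of 0.5
N_STEPS = 20         # so the value space is {0.0, 0.5, ..., 10.0}

cs = ConfigurationSpace(seed=42)
cs.add_hyperparameter(UniformIntegerHyperparameter("x_step", lower=0, upper=N_STEPS))


def quantize(config) -> float:
    # Map the integer step sampled by the optimizer back to the quantized value.
    return LOWER + config["x_step"] * Q


config = cs.sample_configuration()
print(config["x_step"], "->", quantize(config))
```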

@bpkroth
Contributor Author

bpkroth commented Aug 6, 2024

@motus, when you get a moment, can you please comment here with some of your observations about the issues with the Ordinal workaround (sketched below)?
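
For context, the workaround in question is roughly to enumerate the quantized values explicitly as an OrdinalHyperparameter, something like the following sketch (illustrative only, not the actual MLOS code):

```python
# Ordinal workaround sketch: expose only the allowed grid of quantized values
# by listing them explicitly as an OrdinalHyperparameter.
from ConfigSpace import ConfigurationSpace, OrdinalHyperparameter

quantized_values = [round(i * 0.5, 1) for i in range(21)]  # 0.0, 0.5, ..., 10.0

cs = ConfigurationSpace(seed=42)
cs.add_hyperparameter(OrdinalHyperparameter("x", sequence=quantized_values))

print(cs.sample_configuration()["x"])
```

One limitation that comes to mind is that an ordinal only encodes the ordering of the values, not the numeric spacing between them, which matters for model-based optimizers.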

@bpkroth
Contributor Author

bpkroth commented Aug 6, 2024

@eddiebergman thanks for the pointers. We'll take a look and see if we can make something work.
For now, we're toying with a monkeypatch workaround in our wrappers; though it's not our favorite, it seems like it might be workable.
microsoft/MLOS#833

@BogueUser

BogueUser commented Aug 7, 2024

> The reason behind it was simply that we didn't know of anyone using these features, and that they had surprising and inconsistent behaviours. They complicated a lot of internal code, and given the limited manpower we can put towards maintaining things, this was a feature we removed. The plan was to eventually re-introduce them if we had requests for it, which it seems you do.

@eddiebergman Ray Tune uses the removed quantization option for the BOHB algorithm, so it was actually in use somewhere. I can't seem to find many people talking about it not working since the change, so I guess it wasn't too heavily used.

@bpkroth
Contributor Author

bpkroth commented Aug 8, 2024

> @eddiebergman Ray Tune uses the removed quantization option for the BOHB algorithm, so it was actually in use somewhere. I can't seem to find many people talking about it not working since the change, so I guess it wasn't too heavily used.

I have an intern who was actually using that too.
