Implement partial fixed sampler. #1117

Closed
keisuke-umezawa opened this issue Apr 14, 2020 · 9 comments
Labels
contribution-welcome (Issue that welcomes contribution.) · feature (Change that does not break compatibility, but affects the public interfaces.) · needs-discussion (Issue/PR which needs discussion.) · no-stale (Exempt from stale bot)

Comments

@keisuke-umezawa
Member

Motivation

Sometimes, when we rerun an experiment, we want to override some parameters with fixed values.

e.g.

study = create_study(sampler=PartialFixedSampler({'lambda_l1': 0.1}, base_sampler=...))

Description

I would like to propose a partial fixed sampler that overrides some parameters with fixed values, because it makes it easy to rerun experiments with only small code changes.

I want to discuss two points here:

  1. Whether a partial fixed sampler is needed.
  2. If it is needed, what should the API look like?
keisuke-umezawa added the feature and needs-discussion labels on Apr 14, 2020
g-votte added the contribution-welcome label on Apr 15, 2020
@nzw0301
Collaborator

nzw0301 commented Apr 29, 2020

We might implement this feature as a new class similar to FixedTrial, since FixedTrial has a similar API. But I'm not sure.

Edit: a Trial-based class is not appropriate after all, so please disregard the suggestion above.

@nzw0301
Collaborator

nzw0301 commented Apr 29, 2020

I think we can already achieve this kind of setup by using Study.enqueue_trial.

For example,

import optuna
n_trials = 10

def objective(trial):
    x = trial.suggest_uniform('x', 0, 10)
    y = trial.suggest_uniform('y', 0, 10)
    return x ** 2 + y ** 2

study = optuna.create_study()
for _ in range(n_trials):
    study.enqueue_trial({'x': 5})

study.optimize(objective, n_trials=n_trials)

# Every trial uses the enqueued value for 'x'.
for t in study.trials:
    assert t.params['x'] == 5

hvy added the no-stale label on Jun 12, 2020
@toshihikoyanase
Member

[FYI] Let me share a prototype of the partial fixed sampler:
https://colab.research.google.com/drive/1HagfZ1bAv6rT_TDuzgtInFq6g0BnzjO8?usp=sharing
It was created to answer the question in Gitter.

I think it follows @keisuke-umezawa's design, but, as @nzw0301 mentioned, this feature can also be implemented using Study.enqueue_trial. Please try the notebook and compare the usability of the two approaches.

@toshihikoyanase
Member

toshihikoyanase commented Aug 4, 2020

This is still an open issue.

I think both approaches (i.e., Study.enqueue_trial and the partial fixed sampler) have pros and cons. For example, with the former, users need to enqueue the parameters n_trials times, and it may not be intuitive to combine with the timeout option of Study.optimize. On the other hand, PartialFixedSampler is a wrapper around another sampler, similar to how relative samplers fall back to an independent sampler, and I'm not sure users are familiar with that idea.

So, it may be better to discuss the approaches based on a concrete implementation in a PR. You can build on my prototype implementation below if you'd like to implement PartialFixedSampler.

from typing import Any
from typing import Dict

from optuna.distributions import BaseDistribution
from optuna.samplers import BaseSampler
from optuna.study import Study
from optuna.trial import FrozenTrial


class PartialFixedSampler(BaseSampler):
    def __init__(self, fixed_params: Dict[str, Any], base_sampler: BaseSampler) -> None:
        self._fixed_params = fixed_params
        self._base_sampler = base_sampler

    def reseed_rng(self) -> None:
        self._base_sampler.reseed_rng()

    def infer_relative_search_space(
        self, study: Study, trial: FrozenTrial
    ) -> Dict[str, BaseDistribution]:
        search_space = self._base_sampler.infer_relative_search_space(study, trial)
        # Remove fixed params from the relative search space so that they fall
        # back to sample_independent, which returns the fixed values.
        for name in self._fixed_params:
            if name in search_space:
                del search_space[name]
        return search_space

    def sample_independent(
        self,
        study: Study,
        trial: FrozenTrial,
        param_name: str,
        param_distribution: BaseDistribution,
    ) -> Any:
        # Fixed params are sampled here.
        if param_name in self._fixed_params:
            return self._fixed_params[param_name]
        return self._base_sampler.sample_independent(study, trial, param_name, param_distribution)

    def sample_relative(
        self, study: Study, trial: FrozenTrial, search_space: Dict[str, BaseDistribution]
    ) -> Dict[str, Any]:
        # Fixed params are never sampled here because they were removed from
        # the relative search space above.
        return self._base_sampler.sample_relative(study, trial, search_space)
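
For illustration, here is a minimal usage sketch of this prototype. The objective function, the parameter values, and the choice of TPESampler as the base sampler are assumptions made for the example, not part of the prototype itself.

import optuna

def objective(trial):
    x = trial.suggest_uniform('x', 0, 10)
    y = trial.suggest_uniform('y', 0, 10)
    return x ** 2 + y ** 2

# 'y' is pinned to 5.0; 'x' is still optimized by the wrapped TPE sampler.
sampler = PartialFixedSampler({'y': 5.0}, base_sampler=optuna.samplers.TPESampler())
study = optuna.create_study(sampler=sampler)
study.optimize(objective, n_trials=10)

# Every trial received the fixed value for 'y'.
assert all(t.params['y'] == 5.0 for t in study.trials)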

@norihitoishida
Contributor

norihitoishida commented Sep 12, 2020

Related PR and issues:

@norihitoishida
Contributor

norihitoishida commented Oct 1, 2020

[WIP] I'll add docstrings. Big thanks to @toshihikoyanase-san!
https://github.com/norihitoishida/optuna/blob/partial-fixed-sampler/optuna/samplers/_partialfixedsampler.py

@norihitoishida
Contributor

Hi @keisuke-umezawa
If there seems to be no problem with #1892, could you please close this issue?

@norihitoishida
Contributor

@toshihikoyanase
Could you please close this issue?

@toshihikoyanase
Member

@norihitoishida Thank you for letting me know. I'll close this issue.

@keisuke-umezawa Please feel free to re-open it if you have further comments.
