
Allow more parameters for BO #200

Merged
jrapin merged 3 commits into master from boopt on May 6, 2019

Conversation

@jrapin (Contributor) commented May 6, 2019

Types of changes

  • Docs change / refactoring / dependency upgrade
  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to change)

Motivation and Context / Related issue

Closes #189

How Has This Been Tested (if it applies)

Checklist

  • The documentation is up-to-date with the changes I made.
  • I have read the CONTRIBUTING document and completed the CLA (see CONTRIBUTING).
  • All tests passed, and additional code has been covered with new tests.

@facebook-github-bot added the CLA Signed label May 6, 2019
@jrapin jrapin merged commit a0d0cb3 into master May 6, 2019
```python
bounds = {f'x{i}': (0., 1.) for i in range(self.dimension)}
seed = np.random.randint(2**32, dtype=np.uint32)
self._bo = BayesianOptimization(self._fake_function, bounds, random_state=np.random.RandomState(seed))
if self._parameters.gp_parameters is not None:
    self._bo.set_gp_parameters(**self._parameters.gp_parameters)
```
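The snippet above builds per-dimension bounds, seeds the optimizer explicitly, and conditionally forwards GP parameters as keyword arguments. A minimal runnable sketch of that pattern follows, using the method name the reviewer suggests (`set_gp_params`). `_StubOptimizer` is a hypothetical stand-in for bayes_opt's `BayesianOptimization` so the example has no external dependency beyond NumPy, and `gp_parameters = {'alpha': 1e-3}` is an assumed, illustrative setting.

```python
import numpy as np

class _StubOptimizer:
    """Hypothetical stand-in for bayes_opt.BayesianOptimization."""

    def __init__(self, f, pbounds, random_state=None):
        self.f = f                    # objective function to maximize
        self.pbounds = pbounds        # dict mapping parameter name -> (low, high)
        self.random_state = random_state
        self.gp_params = {}

    def set_gp_params(self, **params):
        # In bayes_opt, these kwargs are forwarded to the underlying
        # Gaussian-process regressor; here we just record them.
        self.gp_params.update(params)

dimension = 3
# One named bound per dimension, as in the snippet above.
bounds = {f'x{i}': (0., 1.) for i in range(dimension)}
# Draw an explicit seed so the run is reproducible while still being
# tied to NumPy's global random state, mirroring the original code.
seed = np.random.randint(2**32, dtype=np.uint32)
bo = _StubOptimizer(lambda **kw: 0.0, bounds,
                    random_state=np.random.RandomState(seed))
gp_parameters = {'alpha': 1e-3}  # hypothetical GP setting for illustration
if gp_parameters is not None:
    bo.set_gp_params(**gp_parameters)
```

The `if ... is not None` guard lets callers leave `gp_parameters` unset and fall back to the optimizer's defaults, which is the behavior the PR adds.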
A Contributor commented:
This actually should be `self._bo.set_gp_params(**self._parameters.gp_parameters)`

@jrapin (Contributor, Author) commented May 13, 2019:

Thanks, I was a bit too fast with it, sorry. It should now be solved.
Btw, don't hesitate to create a new issue (or PR) for this kind of comment; I nearly did not see it in my emails :s

A Contributor commented:

> Thanks, I was a bit too fast with it, sorry. It should now be solved.
> Btw, don't hesitate to create a new issue (or PR) for this kind of comment; I nearly did not see it in my emails :s

I really don't feel like making a new PR for just correcting a very small typo :D

@jrapin jrapin mentioned this pull request May 13, 2019
@jrapin jrapin deleted the boopt branch May 13, 2019 12:27
Successfully merging this pull request may close these issues:

  • Ability to tune the underlying bayes_opt params