
[tune] remove legacy search algorithms #41414

Merged: 6 commits into ray-project:master on Nov 28, 2023

Conversation

matthewdeng
Contributor

Why are these changes needed?

This PR removes legacy search algorithms that are not seeing much usage. This list consists of:

  • Dragonfly
  • (FLAML) BlendSearch, CFO
  • Nevergrad
  • SigOpt

Related issue number

Follow-up to #41348.

Checks

  • I've signed off every commit (by using the -s flag, i.e., git commit -s) in this PR.
  • I've run scripts/format.sh to lint the changes in this PR.
  • I've included any doc changes needed for https://docs.ray.io/en/master/.
    • I've added any new APIs to the API Reference. For example, if I added a
      method in Tune, I've added it in doc/source/tune/api/ under the
      corresponding .rst file.
  • I've made sure the tests are passing. Note that there might be a few flaky tests, see the recent failures at https://flakey-tests.ray.io/
  • Testing Strategy
    • Unit tests
    • Release tests
    • This PR is not tested :(

Signed-off-by: Matthew Deng <matt@anyscale.com>
Contributor

@pcmoritz left a comment


Very nice! Do we also need to regenerate requirements_compiled.txt?

@matthewdeng
Contributor Author

@pcmoritz yep, waiting for it to generate here and will update the PR with the updated file!

@matthewdeng merged commit fa4963b into ray-project:master on Nov 28, 2023
16 of 17 checks passed
@matthewdeng deleted the tune-cleanup branch on Nov 28, 2023 at 00:53
ujjawal-khare pushed a commit to ujjawal-khare-27/ray that referenced this pull request Nov 29, 2023
@anhnami

anhnami commented Dec 23, 2023

Nevergrad's OnePlusOne and the recent LogNormalOnePlusOne have been my favorites. They yielded better performance than Optuna for me. It's kind of sad that they were removed for no valid reason.

@anhnami

anhnami commented Dec 23, 2023

Just want to add: this change does not allow me to load my old Tune results.

```
File "~/miniforge3/envs/dev/lib/python3.11/site-packages/ray/tune/impl/tuner_internal.py"

     327  fs, fs_path = get_fs_and_path(path_or_uri, storage_filesystem)
     328  with fs.open_input_file(os.path.join(fs_path, _TUNER_PKL)) as f:
-->  329      tuner_state = pickle.loads(f.readall())
     331  old_trainable_name, flattened_param_space_keys = self._load_tuner_state(
     332      tuner_state
     333  )
     335  # Perform validation and set the re-specified trainable and param_space

ModuleNotFoundError: No module named 'ray.tune.search.nevergrad'
```
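This failure mode is generic to pickled checkpoints: pickle serializes objects by their defining module path and class name, so once that module is removed from the installed package, unpickling fails before any Tune code even runs. A minimal stdlib-only sketch of the mechanism (the `legacy_search` module and `LegacySearcher` class are made up for illustration; they stand in for `ray.tune.search.nevergrad` and its searcher):

```python
import pickle
import sys
import types

# Fabricate a module containing a searcher class, standing in for a
# search-algorithm module that a later release removes.
mod = types.ModuleType("legacy_search")

class LegacySearcher:
    pass

# Pickle records the class by module path + qualified name, so point the
# class at the fake module before serializing an instance.
LegacySearcher.__module__ = "legacy_search"
LegacySearcher.__qualname__ = "LegacySearcher"
mod.LegacySearcher = LegacySearcher
sys.modules["legacy_search"] = mod

# Roughly what a tuner.pkl holds: a pickled object graph that references
# the searcher's module by name.
data = pickle.dumps(LegacySearcher())

# Simulate upgrading to a release where the module no longer exists.
del sys.modules["legacy_search"]

try:
    pickle.loads(data)
    err = None
except ModuleNotFoundError as exc:
    err = exc

print(err)  # → No module named 'legacy_search'
```

A workaround in the real case is to unpickle old results in an environment pinned to the last Ray version that still ships the module.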

@JustasB

JustasB commented Dec 27, 2023

@matthewdeng @pcmoritz I second @anhnami's comment. My team uses Nevergrad extensively, and its removal from Ray Tune is a major loss for us.

@teytaud

teytaud commented Jan 9, 2024

I am one of the developers of Nevergrad. In the Nevergrad group we see users mentioning that we are no longer available in Ray; is there anything we should do on the Nevergrad side so that it is not a problem for the people taking care of Ray? Happy to do anything that can help.

@teytaud

teytaud commented Jan 9, 2024

Nevergrad's OnePlusOne and the recent LogNormalOnePlusOne have been my favorites. They yielded better performance than Optuna for me. It's kind of sad that they were removed for no valid reason.

I added LogNormalOnePlusOne recently and I love it. Since it seems I am not the only one, maybe I should add variants of it to Nevergrad as well.

@matthewdeng
Contributor Author

Apologies all for the disruption, I'll get Nevergrad added back into Ray Tune.

@daveqs

daveqs commented Feb 11, 2024

I presently use FLAML with Ray Tune to run AutoML jobs. With FLAML's BlendSearch and CFO algorithms no longer available in Ray Tune, is there a recommended alternative for using Tune to perform AutoML?
