chore: release v0.5.0 (#620)
Co-authored-by: Matthias Seeger <matthis@amazon.de>
wesk and mseeger committed Apr 20, 2023
1 parent a84e585 commit 90cef33
Showing 12 changed files with 53 additions and 10 deletions.
37 changes: 37 additions & 0 deletions CHANGELOG.md
@@ -4,6 +4,43 @@ All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](http://keepachangelog.com/).

<a name="v0.5.0"></a>
## [v0.5.0] - 2023-04-20

### New Features

- Speculative early checkpoint removal for async multi-fidelity ([#628](https://github.com/awslabs/syne-tune/issues/628))
- Simple linear scalarization scheduler ([#619](https://github.com/awslabs/syne-tune/issues/619))
- Automatic termination criterion ([#605](https://github.com/awslabs/syne-tune/issues/605))
- All schedulers now provide an `is_multiobjective_scheduler` method ([#618](https://github.com/awslabs/syne-tune/issues/618))
- Allow customized extra results to be written to `results.csv.zip` ([#612](https://github.com/awslabs/syne-tune/issues/612))
- Plotting functions to analyse multi-objective experiments ([#611](https://github.com/awslabs/syne-tune/issues/611))
- Downsampling of observed data for single-fidelity Bayesian optimization ([#607](https://github.com/awslabs/syne-tune/issues/607))

### Bug Fixes
- CI failing with `ModuleNotFoundError: No module named 'examples'` error ([#626](https://github.com/awslabs/syne-tune/issues/626))
- Call init of class ([#584](https://github.com/awslabs/syne-tune/issues/584))
- Random seed initialisation limited to int32 ([#608](https://github.com/awslabs/syne-tune/issues/608))
- Make sure that checkpoints in PBT are removed once they are no longer needed ([#600](https://github.com/awslabs/syne-tune/pull/600))

### Code Refactoring
- Move early checkpoint removal into mixin ([#621](https://github.com/awslabs/syne-tune/issues/621))
- Keep rung levels sorted in HyperbandScheduler ([#604](https://github.com/awslabs/syne-tune/issues/604))
- Move utils from benchmarking to syne_tune ([#606](https://github.com/awslabs/syne-tune/pull/606))

### Documentation Updates
- Update instructions for how to install from source ([#629](https://github.com/awslabs/syne-tune/issues/629))
- Update README.md ([#615](https://github.com/awslabs/syne-tune/pull/615))

### Maintenance
- Bump codecov/codecov-action from 3.1.1 to 3.1.2 ([#623](https://github.com/awslabs/syne-tune/issues/623))
- Bump zgosalvez/github-actions-ensure-sha-pinned-actions from 2.1.0 to 2.1.2 ([#624](https://github.com/awslabs/syne-tune/issues/624))
- Add release drafter automation ([#568](https://github.com/awslabs/syne-tune/issues/568))
- Moved scheduler metadata generation ([#617](https://github.com/awslabs/syne-tune/issues/617))
- Bump tensorflow from 2.11.0 to 2.11.1 in /examples/training_scripts/rl_cartpole ([#609](https://github.com/awslabs/syne-tune/issues/609))

[v0.5.0]: https://github.com/awslabs/syne-tune/compare/v0.4.1...v0.5.0

## [0.4.1] - 2023-03-16

We release version 0.4.1, which you can install with `pip install syne-tune[extra]`.
1 change: 0 additions & 1 deletion benchmarking/nursery/benchmark_dehb/requirements.txt
@@ -1,3 +1,2 @@
syne-tune[gpsearchers,kde,blackbox-repository,yahpo,aws]
tqdm
sortedcontainers # Remove in version > 0.4.1
1 change: 0 additions & 1 deletion benchmarking/nursery/benchmark_dyhpo/requirements.txt
@@ -1,3 +1,2 @@
syne-tune[gpsearchers,blackbox-repository,aws]
tqdm
sortedcontainers # Remove in version > 0.4.1
1 change: 0 additions & 1 deletion benchmarking/nursery/benchmark_hypertune/requirements.txt
@@ -1,3 +1,2 @@
syne-tune[gpsearchers,kde,blackbox-repository,aws]
tqdm
sortedcontainers # Remove in version > 0.4.1
1 change: 0 additions & 1 deletion benchmarking/nursery/benchmark_neuralband/requirements.txt
@@ -1,4 +1,3 @@
syne-tune[gpsearchers,kde,blackbox-repository,aws]
tqdm
botorch
sortedcontainers # Remove in version > 0.4.1
1 change: 0 additions & 1 deletion benchmarking/nursery/benchmark_warping/requirements.txt
@@ -1,3 +1,2 @@
syne-tune[gpsearchers,blackbox-repository,aws]
tqdm
sortedcontainers # Remove in version > 0.4.1
1 change: 0 additions & 1 deletion benchmarking/nursery/benchmark_yahpo/requirements.txt
@@ -1,3 +1,2 @@
syne-tune[gpsearchers,blackbox-repository,yahpo,aws]
tqdm
sortedcontainers # Remove in version > 0.4.1
@@ -1,4 +1,3 @@
syne-tune[gpsearchers]
datasets==1.8.0
transformers
sortedcontainers # Remove in version > 0.4.1
@@ -1,3 +1,2 @@
syne-tune[gpsearchers,aws]
tqdm
sortedcontainers # Remove in version > 0.4.1
1 change: 0 additions & 1 deletion benchmarking/nursery/launch_sagemaker/requirements.txt
@@ -1,3 +1,2 @@
syne-tune[gpsearchers,aws]
tqdm
sortedcontainers # Remove in version > 0.4.1
15 changes: 15 additions & 0 deletions docs/source/index.rst
@@ -31,6 +31,21 @@ This package provides state-of-the-art algorithms for hyperparameter optimization
What's New?
-----------

* Speculative early checkpoint removal for asynchronous multi-fidelity optimization.
  Retaining all checkpoints often exhausts the available disk space when training
  large models. With this feature, Syne Tune automatically removes checkpoints
  that are unlikely to be needed (a conceptual sketch follows this list).
  `Details <faq.html#checkpoints-are-filling-up-my-disk-what-can-i-do>`__.
* New multi-objective scheduler:
  :class:`~syne_tune.optimizer.schedulers.multiobjective.LinearScalarizedScheduler`.
  It turns a multi-objective problem into a single-objective one by optimizing a
  linear combination of all objectives, and it can wrap any single-objective
  scheduler (a usage sketch follows this list).
* Support for the automatic termination criterion proposed by Makarova et al.
  Instead of defining a fixed number of iterations or a wall-clock time limit, we
  can set a threshold on how much worse we allow the final solution to be
  compared to the global optimum, so that the optimization process stops
  automatically once a solution meets this criterion (a conceptual sketch
  follows this list).
* You can now customize writing out results during an experiment, as shown in
`examples/launch_height_extra_results.py <examples.html#customize-results-written-during-an-experiment>`__.
* You can now warp inputs and apply a
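A conceptual sketch of the early checkpoint removal policy described above. This
is a hypothetical illustration, not Syne Tune's implementation; the function
name, its arguments, and the ``resume_probability`` estimates are all
assumptions:

.. code-block:: python

   import os
   from typing import Dict

   def remove_unlikely_checkpoints(
       checkpoint_paths: Dict[str, str],       # trial_id -> checkpoint file
       resume_probability: Dict[str, float],   # trial_id -> estimated chance of resume
       max_checkpoints: int,
   ) -> None:
       """Keep checkpoints of the trials most likely to be resumed; delete the rest."""
       ranked = sorted(
           checkpoint_paths, key=lambda t: resume_probability[t], reverse=True
       )
       for trial_id in ranked[max_checkpoints:]:
           os.remove(checkpoint_paths[trial_id])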
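A minimal usage sketch of the scalarized scheduler. The import path matches the
class reference above, but the constructor arguments (``metric`` as a list,
``mode``, ``scalarization_weights``) are assumptions based on the description:

.. code-block:: python

   from syne_tune.config_space import loguniform
   from syne_tune.optimizer.schedulers.multiobjective import (
       LinearScalarizedScheduler,
   )

   config_space = {"learning_rate": loguniform(1e-6, 1e-2)}

   # Scalarize two objectives into one; internally, a single-objective
   # scheduler optimizes the weighted combination of the metrics.
   scheduler = LinearScalarizedScheduler(
       config_space=config_space,
       metric=["accuracy", "latency"],      # assumed: list of objective names
       mode=["max", "min"],                 # assumed: per-objective direction
       scalarization_weights=[1.0, 0.5],    # assumed: weights of the combination
   )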
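The termination rule itself is easy to state in code. The helper below is a
hypothetical sketch of the rule described above (assuming minimization), not
the implementation referenced in the changelog:

.. code-block:: python

   def should_terminate(
       best_observed: float, estimated_optimum: float, tolerance: float
   ) -> bool:
       # Stop once the incumbent is within `tolerance` of the estimated
       # global optimum (minimization); `estimated_optimum` would come
       # from a model-based lower bound on the achievable objective.
       return best_observed - estimated_optimum <= tolerance

   # Example: stop when the best loss is within 0.01 of the estimated optimum.
   assert should_terminate(best_observed=0.105, estimated_optimum=0.1, tolerance=0.01)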
2 changes: 1 addition & 1 deletion syne_tune/version
@@ -1 +1 @@
0.4.1
0.5.0
