Commit

Merge cf1bd04 into 6ac68bf
Bronzila committed Mar 23, 2024
2 parents 6ac68bf + cf1bd04 commit 4ad43b1
Showing 7 changed files with 798 additions and 186 deletions.
1 change: 1 addition & 0 deletions docs/examples/04_restarting_an_optimization_run.ipynb
223 changes: 223 additions & 0 deletions examples/04_restarting_an_optimization_run.ipynb
@@ -0,0 +1,223 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Logging and Restarting an Optimization Run"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This notebook describes how DEHB logs its state and results and how you can reload a checkpoint from the disk and restart the optimization run.\n",
"\n",
"DEHB supports logging in three different ways, which can be specified in the constructor of DEHB via the `save_freq` parameter:\n",
"1. `\"end\"`, saving the optimizer state only at the end of optimization (at the end of `run`). Note: This option is suboptimal for users using the ask & tell interface.\n",
"2. `\"incumbent\"`, saving the optimizer state after the incumbent changes.\n",
"3. `\"step\"`, saving the optimizer state after every step, i.e. after every call of `tell`.\n",
"\n",
"No matter what option is chosen, the state will always also be saved after the `run` function has finished (similar as in `\"end\"`)."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The directory, where the state and logs will be saved is specified via the `output_path` parameter. If no output path is specified, the current directory is used."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Setting up DEHB\n",
"Here we only use a toy setup for DEHB as in the `interfacing_DEHB` example. For a detailed description of the unique parts of DEHB, please refer to [this example](https://github.com/automl/DEHB/blob/master/examples/00_interfacing_DEHB.ipynb)."
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {},
"outputs": [],
"source": [
"import time\n",
"import warnings\n",
"from typing import Dict, List, Optional, Union\n",
"\n",
"import ConfigSpace\n",
"import numpy as np\n",
"\n",
"warnings.filterwarnings(\"ignore\")\n",
"\n",
"def target_function(\n",
" x: Union[ConfigSpace.Configuration, List, np.array],\n",
" fidelity: Optional[Union[int, float]] = None,\n",
" **kwargs,\n",
") -> Dict:\n",
" start = time.time()\n",
" y = np.random.uniform() # placeholder response of evaluation\n",
" time.sleep(0.05) # simulates runtime\n",
" cost = time.time() - start\n",
"\n",
" # result dict passed to DE/DEHB as function evaluation output\n",
" result = {\n",
" \"fitness\": y, # must-have key that DE/DEHB minimizes\n",
" \"cost\": cost, # must-have key that associates cost/runtime \n",
" \"info\": dict() # optional key containing a dictionary of additional info\n",
" }\n",
" return result"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [],
"source": [
"import ConfigSpace\n",
"\n",
"\n",
"def create_search_space():\n",
" # Creating a one-dimensional search space of real numbers in [3, 10]\n",
" cs = ConfigSpace.ConfigurationSpace()\n",
" cs.add_hyperparameter(ConfigSpace.UniformFloatHyperparameter(\"x0\", lower=3, upper=10, log=False))\n",
" return cs\n",
"\n",
"cs = create_search_space()\n",
"dimensions = len(cs.get_hyperparameters())\n",
"min_fidelity, max_fidelity = (0.1, 3)"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [],
"source": [
"from dehb import DEHB\n",
"\n",
"dehb = DEHB(\n",
" f=target_function,\n",
" dimensions=dimensions,\n",
" cs=cs,\n",
" min_fidelity=min_fidelity,\n",
" max_fidelity=max_fidelity,\n",
" output_path=\"./temp_folder\",\n",
" save_freq=\"end\",\n",
" n_workers=1,\n",
")"
]
},
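{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a brief aside, the sketch below shows how the `\"step\"` option pairs with the ask & tell interface: since `run` is never called in that setting, saving only at the end of `run` would never trigger, whereas `\"step\"` writes a checkpoint after every `tell`. This cell is only a sketch; the `ask`/`tell` calls and the `job_info` keys follow the pattern of the ask & tell example (`01.2_Optimizing_RandomForest_using_Ask_Tell.ipynb`), so adapt them if your DEHB version differs. The optimizer name `dehb_step` and the output folder `./temp_folder_step` are chosen here purely for illustration."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Sketch only: a second optimizer that checkpoints after every tell().\n",
"dehb_step = DEHB(\n",
"    f=target_function,\n",
"    dimensions=dimensions,\n",
"    cs=cs,\n",
"    min_fidelity=min_fidelity,\n",
"    max_fidelity=max_fidelity,\n",
"    output_path=\"./temp_folder_step\",  # separate folder, so the checkpoints do not mix\n",
"    save_freq=\"step\",\n",
"    n_workers=1,\n",
")\n",
"\n",
"# Manual ask & tell loop; each tell() also persists the optimizer state.\n",
"for _ in range(10):\n",
"    job_info = dehb_step.ask()\n",
"    result = target_function(job_info[\"config\"], fidelity=job_info[\"fidelity\"])\n",
"    dehb_step.tell(job_info, result)"
]
},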
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Running DEHB\n",
"First, we want to run DEHB for 5 brackets, later we will use the created checkpoint to restart the optimization. Since we used the option `\"end\"`, the state will only be saved after 5 brackets."
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\u001b[32m2024-03-11 18:21:29.797\u001b[0m | \u001b[1mINFO \u001b[0m | \u001b[36mdehb.optimizers.dehb\u001b[0m:\u001b[36msave\u001b[0m:\u001b[36m915\u001b[0m - \u001b[1mSaving state to disk...\u001b[0m\n",
"Trajectory length: 105\n",
"Incumbent:\n",
"(Configuration(values={\n",
" 'x0': 5.691861434763917,\n",
"}), 0.007631296974771384)\n"
]
}
],
"source": [
"trajectory, runtime, history = dehb.run(brackets=5)\n",
"\n",
"print(f\"Trajectory length: {len(trajectory)}\")\n",
"print(\"Incumbent:\")\n",
"print(dehb.get_incumbents())"
]
},
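{
"cell_type": "markdown",
"metadata": {},
"source": [
"Before restarting, we can take a quick look at the output directory specified above. The exact file names depend on the DEHB version, so the small helper cell below simply lists whatever DEHB has written there (typically the log file and the saved optimizer state)."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from pathlib import Path\n",
"\n",
"# List everything DEHB wrote to the output directory used above.\n",
"for path in sorted(Path(\"./temp_folder\").iterdir()):\n",
"    print(path.name)"
]
},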
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Restarting DEHB\n",
"Now, we use the previously created checkpoint to restart the optimization run. For this, we specifiy the same `output_path` as above and additionally set the `resume` flag to `True`. After reloading the checkpoint, we run for another five brackets and report the results."
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\u001b[32m2024-03-11 18:21:29.885\u001b[0m | \u001b[1mINFO \u001b[0m | \u001b[36mdehb.optimizers.dehb\u001b[0m:\u001b[36m__init__\u001b[0m:\u001b[36m226\u001b[0m - \u001b[1mLoading checkpoint...\u001b[0m\n",
"\u001b[32m2024-03-11 18:21:33.967\u001b[0m | \u001b[1mINFO \u001b[0m | \u001b[36mdehb.optimizers.dehb\u001b[0m:\u001b[36msave\u001b[0m:\u001b[36m915\u001b[0m - \u001b[1mSaving state to disk...\u001b[0m\n",
"Trajectory length: 183\n",
"Incumbent:\n",
"(Configuration(values={\n",
" 'x0': 5.691861434763917,\n",
"}), 0.007631296974771384)\n"
]
}
],
"source": [
"dehb = DEHB(\n",
" f=target_function,\n",
" dimensions=dimensions,\n",
" cs=cs,\n",
" min_fidelity=min_fidelity,\n",
" max_fidelity=max_fidelity,\n",
" output_path=\"./temp_folder\",\n",
" save_freq=\"end\",\n",
" n_workers=1,\n",
" resume=True,\n",
")\n",
"\n",
"trajectory, runtime, history = dehb.run(brackets=5)\n",
"\n",
"print(f\"Trajectory length: {len(trajectory)}\")\n",
"print(\"Incumbent:\")\n",
"print(dehb.get_incumbents())"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.16"
}
},
"nbformat": 4,
"nbformat_minor": 4
}
1 change: 1 addition & 0 deletions mkdocs.yml
@@ -12,6 +12,7 @@ nav:
- Optimizing RandomForest using DEHB: examples/01.1_Optimizing_RandomForest_using_DEHB.ipynb
- Using the Ask & Tell interface: examples/01.2_Optimizing_RandomForest_using_Ask_Tell.ipynb
- Using DEHB without ConfigSpace: examples/02_using_DEHB_without_ConfigSpace.ipynb
- Logging and Restarting: examples/04_restarting_an_optimization_run.ipynb
- Code Reference:
- DEHB: references/dehb.md
- DE: references/de.md
