95 commits
dec8a1b
Merge pull request #1 from automl/master
lumib Apr 14, 2025
4ea7acb
Initial code for the new pipeline space
lumib Apr 18, 2025
5bbbce8
Add centering
lumib Apr 18, 2025
a6b10ff
Add bracket optimizers
lumib Apr 20, 2025
6703974
WIP
lumib Apr 25, 2025
9cbcc97
Allow having Operation args and kwargs themselves be Resolvable objects
lumib May 11, 2025
ed417fe
Add NOS like space
lumib May 23, 2025
0ad5ac1
Minor change in config pretty print
lumib May 23, 2025
6ce6507
Merge remote-tracking branch 'upstream/master'
Meganton Jul 1, 2025
e72a3d0
Merge branch 'master' into new-pipeline-space
Meganton Jul 1, 2025
8a68a2c
Add comprehensive tests for NEPS space functionality
Meganton Jul 1, 2025
ceccae3
Refactor NEPS space imports and update test cases to use neps_space m…
Meganton Jul 1, 2025
2f33c40
Refactored Optimizers
Meganton Jul 2, 2025
8c7e4ab
Refactor NEPS space tests to use updated parameters module
Meganton Jul 2, 2025
e9294f7
Refactor parameter imports and class definitions in NEPS space tests
Meganton Jul 2, 2025
3b58495
Refactor NEPS space integration tests to remove redundant evaluation …
Meganton Jul 3, 2025
9bc6ca8
Add Pytorch Neural Network example
Meganton Jul 4, 2025
eaceb8c
Change import order
Meganton Jul 4, 2025
1c854b2
Refactor optimizer imports in integration tests
Meganton Jul 5, 2025
917939b
Refactor type hints and improve docstring formatting in optimizer mod…
Meganton Jul 5, 2025
bc89f65
Refactor documentation and remove redundant arguments in optimizer mo…
Meganton Jul 6, 2025
5b68925
Enhance docstrings for Resolvable, Fidelity, Pipeline, Domain, and Op…
Meganton Jul 6, 2025
e3dd79e
Refactor SamplingResolutionContext and Domain classes for improved cl…
Meganton Jul 6, 2025
8c5e9f0
Add documentation for NePS Spaces framework and usage examples
Meganton Jul 6, 2025
a8f641a
Fix indentation in docstring for clarity in SamplingResolutionContext…
Meganton Jul 6, 2025
2243da7
Improve documentation clarity in NePS Spaces by refining section titl…
Meganton Jul 6, 2025
b6a7b30
Refine documentation in NePS Spaces by correcting terminology, enhanc…
Meganton Jul 7, 2025
0a5a7e1
Add neps_algorithms as direct sub-import of neps
Meganton Jul 7, 2025
5aca1b5
Refactor hyperparameter space definitions and update tests to use new…
Meganton Jul 9, 2025
6bfbd7c
Update documentation references and remove deprecated pipeline_space.…
Meganton Jul 9, 2025
846b45d
Refactor NePS examples and documentation to enhance clarity and usabi…
Meganton Jul 9, 2025
19dbe2c
Add optimizer compatibility check for Pipeline in run function
Meganton Jul 9, 2025
3532805
Add warmstarting functionality to the run method and create example s…
Meganton Jul 9, 2025
b127740
- Import warmstart_neps in the API module.
Meganton Jul 9, 2025
5a5e0c4
Refactor warmstart_neps function to improve parameter naming and logg…
Meganton Jul 10, 2025
79aab60
Add Gaussian priors to NePS Integer and Floats
Meganton Jul 10, 2025
37e3a89
Fix variable name in check_neps_space_compatibility function
Meganton Jul 10, 2025
b0b8e05
Enhance status function to support NePS-only optimizers; integrate Ne…
Meganton Jul 10, 2025
d467011
Enhance formatted summary method documentation to clarify usage of pi…
Meganton Jul 10, 2025
fbf89b3
Add warnings for warmstarting compatibility with NEPS optimizers in r…
Meganton Jul 10, 2025
63a6661
Add compatibility check for warmstarting in warmstart_neps function a…
Meganton Jul 10, 2025
f8b560a
Enhance NEPS functions to support internal runtime calls and improve …
Meganton Jul 11, 2025
79ae4d5
Add warnings for repeated warmstarting in warmstart_neps function and…
Meganton Jul 11, 2025
801687b
Add error handling and logging for pipeline space resolution in warms…
Meganton Jul 11, 2025
10d3104
Add timing parameters to warmstart_neps function for performance trac…
Meganton Jul 13, 2025
eaa66e3
Add location parameter to warmstart_neps function for configuration t…
Meganton Jul 13, 2025
9547d5a
Add warning log for missing priors when use_priors is set to True in …
Meganton Jul 13, 2025
88da9be
Refactor NEPS Pipeline classes to use PipelineSpace
Meganton Jul 28, 2025
0db699f
Add resolving for tuples and lists
lumib Jul 30, 2025
07b923c
Fix test
lumib Jul 30, 2025
4f8d778
Change neps_spaces tests to use a /tmp/ directory
lumib Jul 30, 2025
dbb5964
Add TODO note to fix new issue with eager tuple/list resolving
lumib Jul 30, 2025
5beebfa
Enhance operation conversion to handle tuples and lists in arguments …
Meganton Jul 30, 2025
40dadc4
Add `Lazy` component and use it to stop categoricals from eagerly res…
lumib Jul 30, 2025
0bd1449
Add dict resolving and simplify operation resolving
lumib Jul 30, 2025
0f015a3
Simplify resolving Operations and correct resolving for sequences and…
lumib Aug 1, 2025
d6d2858
Match above definition
lumib Aug 1, 2025
0764f40
Small changes
lumib Aug 1, 2025
69587c8
Fix imports
lumib Aug 2, 2025
6c1a230
Add repeated resolvable type
lumib Aug 2, 2025
ab656f2
Add TODOs
lumib Aug 2, 2025
54f019d
Add inc_ratio parameter to _neps_bracket_optimizer and NePSPriorBandS…
Meganton Aug 4, 2025
cddc747
Refactor condition checks in convert_configspace to ensure attributes…
Meganton Sep 7, 2025
755723e
Add convert_classic_to_neps_search_space function to convert classic …
Meganton Sep 7, 2025
0e5f234
Enhance compatibility check in run function to ensure pipeline_space …
Meganton Sep 7, 2025
cd8c389
Add log parameter to Integer and Float conversions in convert_classic…
Meganton Sep 7, 2025
d4437c0
Add string representation method to PipelineSpace class
Meganton Sep 7, 2025
a443002
Remove redundant check for 'auto' in compatibility verification
Meganton Sep 7, 2025
9f8613f
Add string representation methods for Fidelity, Categorical, Float, I…
Meganton Sep 7, 2025
85ffe02
Simplify string representation method in Fidelity class
Meganton Sep 7, 2025
d3e76dc
Simplify string representation method in Categorical class
Meganton Sep 7, 2025
10b5080
Add conversion logic for classic to NEPS search space in API
Meganton Sep 8, 2025
6fe13c9
Refactor run function to use 'space' variable for consistency in comp…
Meganton Sep 8, 2025
0d9fafc
Update NePS Spaces documentation to clarify usage of pipeline_space i…
Meganton Sep 21, 2025
464bae7
Merge branch 'master' into neps-spaces
Meganton Sep 21, 2025
adfefe1
feat: Enhance NePS Spaces with new parameter addition and removal met…
Meganton Sep 23, 2025
0ab0e09
feat: Add methods for adding and removing parameters in NePS Spaces
Meganton Sep 23, 2025
6e3faa9
Started HyperBand
Meganton Sep 25, 2025
cfb4b11
Introduce NePS-space compatible Grid Search and Hyperband variants
Meganton Oct 11, 2025
db37873
Introduce NePS-space compatible Grid Search and Hyperband variants - …
Meganton Oct 11, 2025
076b07d
Merge branch 'neps-spaces-updates' of https://github.com/automl/neps …
Meganton Oct 11, 2025
6f7011d
feat: Enhance DefaultWorker to return cumulative metrics and add add_…
Meganton Oct 11, 2025
97be100
feat: Refactor trace text generation into a reusable function for bet…
Meganton Oct 13, 2025
a2c82a7
feat: Refactor load_incumbent_trace to include cumulative metrics tra…
Meganton Oct 13, 2025
cb02744
feat: Enhance SamplingResolutionContext with path-aware resolution me…
Meganton Oct 13, 2025
eb985b0
feat: Update best_config.txt with final cumulative metrics upon stopp…
Meganton Oct 14, 2025
85916f4
feat: Update summary writing logic to ensure metrics are recorded onl…
Meganton Oct 15, 2025
86a3775
feat: Refactor trajectory_of_improvements to utilize _build_trace_tex…
Meganton Oct 21, 2025
69d6eb5
feat: Move _build_trace_texts function to status.py and update import…
Meganton Oct 21, 2025
34af3aa
feat: Remove trajectory_of_improvements function
Meganton Oct 22, 2025
740065a
feat: Fix resampling by seperating __eq and is_equivalent. Improved c…
Meganton Oct 23, 2025
a730816
Add comprehensive tests for NePS space conversion, compatibility, and…
Meganton Oct 23, 2025
4ed7bda
Remove old warmstarting functionality, unused example notebooks and s…
Meganton Oct 23, 2025
89f3346
feat: Refactor pipeline space definition into a class and add new int…
Meganton Oct 23, 2025
17cb82e
Merge branch 'master' into neps-spaces
Meganton Oct 23, 2025
23 changes: 11 additions & 12 deletions docs/getting_started.md
@@ -13,18 +13,17 @@ pip install neural-pipeline-search

## The 3 Main Components

1. **Establish a [`pipeline_space`](reference/pipeline_space.md)**:
1. **Establish a [`pipeline_space=`](reference/neps_spaces.md)**:

```python
pipeline_space={
"some_parameter": (0.0, 1.0), # float
"another_parameter": (0, 10), # integer
"optimizer": ["sgd", "adam"], # categorical
"epoch": neps.Integer(lower=1, upper=100, is_fidelity=True),
"learning_rate": neps.Float(lower=1e-5, upper=1, log=True),
"alpha": neps.Float(lower=0.1, upper=1.0, prior=0.99, prior_confidence="high")
}

class ExampleSpace(neps.PipelineSpace):
# Define the parameters of your search space
some_parameter = neps.Float(min_value=0.0, max_value=1.0) # float
another_parameter = neps.Integer(min_value=0, max_value=10) # integer
optimizer = neps.Categorical(choices=("sgd", "adam")) # categorical
epoch = neps.Fidelity(neps.Integer(min_value=1, max_value=100))
learning_rate = neps.Float(min_value=1e-5, max_value=1, log=True)
alpha = neps.Float(min_value=0.1, max_value=1.0, prior=0.99, prior_confidence="high")
```

2. **Define an `evaluate_pipeline()` function**:
@@ -42,7 +41,7 @@ def evaluate_pipeline(some_parameter: float,
3. **Execute with [`neps.run()`](reference/neps_run.md)**:

```python
neps.run(evaluate_pipeline, pipeline_space)
neps.run(evaluate_pipeline, ExampleSpace())
```

---
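Putting the three components together, a minimal end-to-end sketch could look like the following. The parameter names mirror `ExampleSpace` above; the loss computation is a toy stand-in for real training code (not part of the NePS API), and the returned dict uses the documented `objective_to_minimize` key:

```python
# Hypothetical sketch: an evaluate_pipeline matching ExampleSpace above.
# The "loss" computed here is a toy surrogate, not a real training run.
def evaluate_pipeline(
    some_parameter: float,
    another_parameter: int,
    optimizer: str,
    epoch: int,
    learning_rate: float,
    alpha: float,
) -> dict:
    base = some_parameter + another_parameter / 10
    penalty = 0.0 if optimizer == "adam" else 0.1  # pretend "adam" helps
    loss = base * alpha + penalty + learning_rate / epoch
    return {"objective_to_minimize": loss}
```

In a real run this function would train and validate a model; NePS only requires that the argument names match the parameters defined in the space.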
@@ -52,7 +51,7 @@ neps.run(evaluate_pipeline, pipeline_space)
The [reference](reference/neps_run.md) section provides detailed information on the individual components of NePS.

1. How to use the [**`neps.run()`** function](reference/neps_run.md) to start the optimization process.
2. The different [search space](reference/pipeline_space.md) options available.
2. The different [search space](reference/neps_spaces.md) options available.
3. How to choose and configure the [optimizer](reference/optimizers.md) used.
4. How to define the [`evaluate_pipeline()` function](reference/evaluate_pipeline.md).
5. How to [analyze](reference/analyse.md) the optimization runs.
21 changes: 7 additions & 14 deletions docs/index.md
@@ -59,33 +59,26 @@ import logging


# 1. Define a function that accepts hyperparameters and computes the validation error
def evaluate_pipeline(
hyperparameter_a: float, hyperparameter_b: int, architecture_parameter: str
) -> dict:
def evaluate_pipeline(hyperparameter_a: float, hyperparameter_b: int, architecture_parameter: str):
# Create your model
model = MyModel(architecture_parameter)

# Train and evaluate the model with your training pipeline
validation_error = train_and_eval(
model, hyperparameter_a, hyperparameter_b
)
validation_error = train_and_eval(model, hyperparameter_a, hyperparameter_b)
return validation_error


# 2. Define a search space of parameters; use the same parameter names as in evaluate_pipeline
pipeline_space = dict(
hyperparameter_a=neps.Float(
lower=0.001, upper=0.1, log=True # The search space is sampled in log space
),
hyperparameter_b=neps.Integer(lower=1, upper=42),
architecture_parameter=neps.Categorical(["option_a", "option_b"]),
)
class ExampleSpace(neps.PipelineSpace):
hyperparameter_a = neps.Float(min_value=0.001, max_value=0.1, log=True) # Log scale parameter
hyperparameter_b = neps.Integer(min_value=1, max_value=42)
architecture_parameter = neps.Categorical(choices=("option_a", "option_b"))

# 3. Run the NePS optimization
logging.basicConfig(level=logging.INFO)
neps.run(
evaluate_pipeline=evaluate_pipeline,
pipeline_space=pipeline_space,
pipeline_space=ExampleSpace(),
root_directory="path/to/save/results", # Replace with the actual path.
evaluations_to_spend=100,
)
34 changes: 17 additions & 17 deletions docs/reference/evaluate_pipeline.md
@@ -1,38 +1,38 @@
# The `evaluate_pipeline` function
# The `evaluate_pipeline` function

> **TL;DR**
> *Sync*: return a scalar or a dict ⟶ NePS records it automatically.
> *Async*: return `None`, launch a job, and call `neps.save_pipeline_results()` when the job finishes.

---

## 1  Return types
## 1 Return types

| Allowed return | When to use | Minimal example |
| -------------- | ------------------------------------------- | -------------------------------------------- |
| **Scalar** | simple objective, single fidelity | `return loss` |
| **Dict** | need cost/extra metrics | `{"objective_to_minimize": loss, "cost": 3}` |
| **`None`** | you launch the job elsewhere (SLURM, k8s …) | *see § 3 Async* |
| **`None`** | you launch the job elsewhere (SLURM, k8s …) | *see § 3 Async* |

All other values raise a `TypeError` inside NePS.
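As a concrete (toy) illustration of the first two rows — the function names and the values below, apart from the documented keys `objective_to_minimize`, `cost`, and `extra`, are made up for the example:

```python
def evaluate_scalar(lr: float) -> float:
    loss = (lr - 0.01) ** 2  # toy objective
    return loss              # scalar return: NePS records it directly

def evaluate_dict(lr: float, seconds: float) -> dict:
    loss = (lr - 0.01) ** 2
    return {
        "objective_to_minimize": loss,  # required
        "cost": seconds,                # only needed with a cost budget
        "extra": {"lr_used": lr},       # optional, preserved in report.yaml
    }
```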

## 2  Result dictionary keys
## 2 Result dictionary keys

| key | purpose | required? |
| ----------------------- | ---------------------------------------------------------------------------- | ----------------------------- |
| `objective_to_minimize` | scalar NePS will minimise | **yes** |
| `cost` | wall‑clock, GPU‑hours, … — only if you passed `cost_to_spend` to `neps.run` | yes *iff* cost budget enabled |
| `cost` | wall‑clock, GPU‑hours, … — only if you passed `cost_to_spend` to `neps.run` | yes *iff* cost budget enabled |
| `learning_curve` | list/np.array of intermediate objectives | optional |
| `extra` | any JSON‑serialisable blob | optional |
| `exception` | any Exception illustrating the error in evaluation | optional |

> **Tip**  Return exactly what you need; extra keys are preserved in the trial’s `report.yaml`.
> **Tip** Return exactly what you need; extra keys are preserved in the trial’s `report.yaml`.

---

## 3  Asynchronous evaluation (advanced)
## 3 Asynchronous evaluation (advanced)

### 3.1 Design
### 3.1 Design

1. **The Python side** (your `evaluate_pipeline` function)

@@ -52,9 +52,9 @@ All other values raise a `TypeError` inside NePS.
when it finishes.
This writes `report.yaml` and marks the trial *SUCCESS* / *CRASHED*.

### 3.2 Code walk‑through
### 3.2 Code walk‑through

`submit.py` – called by NePS synchronously
`submit.py` – called by NePS synchronously

```python
from pathlib import Path
@@ -68,7 +68,7 @@ def evaluate_pipeline(
learning_rate: float,
optimizer: str,
):
# 1) write a Slurm script
# 1) write a Slurm script
script = f"""#!/bin/bash
#SBATCH --time=0-00:10
#SBATCH --job-name=trial_{pipeline_id}
@@ -83,15 +83,15 @@ python run_pipeline.py \
--root_dir {root_directory}
""")

# 2) submit and RETURN None (async)
# 2) submit and RETURN None (async)
script_path = pipeline_directory / "submit.sh"
script_path.write_text(script)
os.system(f"sbatch {script_path}")

return None # ⟵ signals async mode
return None # ⟵ signals async mode
```

`run_pipeline.py` – executed on the compute node
`run_pipeline.py` – executed on the compute node

```python
import argparse, json, time, neps
@@ -128,7 +128,7 @@ neps.save_pipeline_results(

The default value for `post_run_summary` is `True`; if you want to prevent any summary creation, set it to `False` in the arguments.

### 3.3 Why this matters
### 3.3 Why this matters

* No worker idles while your job is in the queue ➜ better throughput.
* Crashes inside the job still mark the trial *CRASHED* instead of hanging.
@@ -138,7 +138,7 @@ the default value for `post_run_summary` is True, if you want to prevent any sum

* When using the async approach, one worker may create as many trials as possible. In `Slurm` or other workload managers, per-user limits make it impossible to overload the system, but if you want to control the resources used for optimization, it is crucial to set `max_evaluations_per_run` when calling `neps.run`.

## 4  Extra injected arguments
## 4 Extra injected arguments

| name | provided when | description |
| ----------------------------- | ----------------------- | ---------------------------------------------------------- |
@@ -150,7 +150,7 @@ Use them to handle warm‑starts, logging and result persistence.

---

## 5  Checklist
## 5 Checklist

* [x] Return scalar **or** dict **or** `None`.
* [x] Include `cost` when using cost budgets.
22 changes: 11 additions & 11 deletions docs/reference/neps_run.md
@@ -17,32 +17,32 @@ import neps

def evaluate_pipeline(learning_rate: float, epochs: int) -> float:
# Your code here

return loss

class ExamplePipeline(neps.PipelineSpace):
learning_rate = neps.Float(1e-3, 1e-1, log=True)
epochs = neps.Fidelity(neps.Integer(10, 100))

neps.run(
evaluate_pipeline=evaluate_pipeline, # (1)!
pipeline_space={ # (2)!
"learning_rate": neps.Float(1e-3, 1e-1, log=True),
"epochs": neps.Integer(10, 100)
},
pipeline_space=ExamplePipeline(), # (2)!
root_directory="path/to/result_dir" # (3)!
)
```

1. The objective function, which NePS minimizes by evaluating various configurations.
It takes these configurations as input and should return either a dictionary or a single loss value.
2. This defines the search space for the configurations from which the optimizer samples.
It accepts either a dictionary with the configuration names as keys, a path to a YAML configuration file, or a [`configSpace.ConfigurationSpace`](https://automl.github.io/ConfigSpace/) object.
For comprehensive information and examples, please refer to the detailed guide available [here](../reference/pipeline_space.md)
It accepts a class instance inheriting from `neps.PipelineSpace` or a [`configSpace.ConfigurationSpace`](https://automl.github.io/ConfigSpace/) object.
For comprehensive information and examples, please refer to the detailed guide available [here](../reference/neps_spaces.md)
3. The directory path where the information about the optimization and its progress gets stored.
This is also used to synchronize multiple calls to `neps.run()` for parallelization.


See the following for more:

* What kind of [pipeline space](../reference/pipeline_space.md) can you define?
* What goes in and what goes out of [`evaluate_pipeline()`](../reference/evaluate_pipeline.md)?
* What kind of [pipeline space](../reference/neps_spaces.md) can you define?
* What goes in and what goes out of [`evaluate_pipeline()`](../reference/evaluate_pipeline.md)?

## Budget, how long to run?
To define a budget, provide `evaluations_to_spend=` to [`neps.run()`][neps.api.run],
@@ -69,7 +69,7 @@ neps.run(
2. Prevents the initiation of new evaluations once this cost threshold is surpassed.
This can be any kind of cost metric you like, such as time, energy, or monetary, as long as you can calculate it.
This requires adding a cost value to the output of the `evaluate_pipeline` function, for example, return `#!python {'objective_to_minimize': loss, 'cost': cost}`.
For more details, please refer [here](../reference/evaluate_pipeline.md)
For more details, please refer [here](../reference/evaluate_pipeline.md)
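A hedged sketch of what reporting such a cost could look like, here using wall-clock time as the cost metric — the training step is a toy placeholder, and only the `objective_to_minimize` and `cost` keys come from the documented result format:

```python
import time

def evaluate_pipeline(learning_rate: float, epochs: int) -> dict:
    start = time.monotonic()
    # ... real training would happen here ...
    loss = (learning_rate - 0.01) ** 2 / epochs  # toy objective
    elapsed = time.monotonic() - start
    # The reported "cost" is what NePS accumulates against cost_to_spend.
    return {"objective_to_minimize": loss, "cost": elapsed}
```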

## Getting some feedback, logging
NePS will not print anything to the console. To view the progress of workers,
@@ -140,7 +140,7 @@ provided to [`neps.run()`][neps.api.run].
│ └── config_2
│ ├── config.yaml
│ └── metadata.json
├── summary
├── summary
│ ├── full.csv
│ └── short.csv
│ ├── best_config_trajectory.txt