Overview
Search classes (Nautilus, Dynesty, Emcee, Zeus, BFGS, Drawer) load default parameters from YAML config files via a blind setattr() loop with no type safety. This makes the source code hard to follow, new samplers hard to add, and config typos hard to detect. Workspace scripts already pass all parameters explicitly, so the YAML defaults are unused in practice.
This refactor moves all search/run parameter defaults into explicit typed __init__ arguments, eliminates config_dict_search/config_dict_run entirely in favour of direct instance attributes, and removes the per-search YAML files.
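For context, the pattern being removed looks roughly like the sketch below. This is an illustrative stand-in, not the actual PyAutoFit source: a YAML-derived dict is attached to the instance via a `setattr()` loop, so a typo'd key silently becomes a stray attribute instead of an error.

```python
class YamlConfiguredSearch:
    """Illustrative stand-in for the current YAML-driven pattern."""

    def __init__(self, config: dict, **kwargs):
        # Blind setattr() loop: every key in the YAML-derived dict becomes
        # an instance attribute, with no type safety and no key validation.
        for key, value in {**config, **kwargs}.items():
            setattr(self, key, value)


# A typo'd keyword such as "n_lve" is accepted silently instead of raising:
search = YamlConfiguredSearch({"n_live": 200, "n_batch": 100}, n_lve=100)
```

Nothing here validates that `n_lve` is a real parameter, which is exactly the class of bug the refactor eliminates.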
Plan
- Move search and run parameter defaults into explicit `__init__` arguments on each search class, with type annotations
- Eliminate `config_dict_search` and `config_dict_run` — replace all dict access with direct instance attributes
- Remove config machinery from the base class (`setattr()` loop, `config_type`, `_class_config`, `_config()`)
- Delete per-search YAML files (`nest.yaml`, `mcmc.yaml`, `mle.yaml`); keep `general.yaml` for cross-cutting settings
- Update tests to assert against instance attributes instead of config-loaded dict values
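Concretely, the target API could look like the hedged sketch below; the class name, parameter set, and defaults are assumptions chosen for illustration, not the actual refactored signatures:

```python
class NautilusSearch:
    """Hypothetical post-refactor search: defaults live in typed __init__."""

    def __init__(
        self,
        name: str = "",
        n_live: int = 200,
        n_batch: int = 100,
        iterations_per_quick_update: int = 1000,
    ):
        self.name = name
        # Direct instance attributes replace config_dict_search["..."] lookups.
        self.n_live = n_live
        self.n_batch = n_batch
        self.iterations_per_quick_update = iterations_per_quick_update


search = NautilusSearch(name="start_here", n_live=100)
```

With this shape, a typo'd keyword (e.g. `NautilusSearch(n_lve=100)`) raises a `TypeError` immediately, and IDEs and type checkers can see every available option.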
Detailed implementation plan
Affected Repositories
Work Classification: Library

Branch Survey

| Repository | Current Branch | Dirty? |
| --- | --- | --- |
| ./PyAutoFit | main | clean |
Suggested branch: `feature/search-config-cleanup`
Worktree root: `~/Code/PyAutoLabs-wt/search-config-cleanup/`
Implementation Steps
- Refactor `NonLinearSearch.__init__` — remove the `setattr()` loop, the config_dict properties, `config_type`, `_class_config`, and `_config()`. Add `silence` as an explicit param. Keep `general.yaml` reads.
- Refactor Nautilus — explicit typed params in `__init__`; replace all `config_dict_search["key"]` access with `self.key`, and replace `**config_dict_search` with explicit kwargs passed to the sampler.
- Refactor DynestyStatic/DynestyDynamic/AbstractDynesty — same pattern.
- Refactor Emcee/Zeus/AbstractMCMC — same pattern; keep the `AutoCorrelationsSettings` class, but with Python defaults.
- Refactor BFGS/LBFGS/Drawer/AbstractMLE — same pattern; `options` become explicit params.
- Remove `AbstractNest.config_type` and related overrides.
- Delete `autofit/config/non_linear/{nest,mcmc,mle}.yaml`.
- Update tests to assert instance attributes directly.
- Update the mock search to remove its `config_dict_search` dependency.
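The test update in the last two steps could look like this minimal sketch; the `MinimalSearch` stand-in and its attribute names are hypothetical, not taken from the real test suite:

```python
class MinimalSearch:
    """Stand-in for a refactored search class with typed defaults."""

    def __init__(self, n_live: int = 200):
        self.n_live = n_live


def test_default_is_plain_attribute():
    # Before: assert search.config_dict_search["n_live"] == 200
    # After: the default is an ordinary instance attribute set in __init__.
    assert MinimalSearch().n_live == 200


def test_override_is_stored_directly():
    assert MinimalSearch(n_live=100).n_live == 100


test_default_is_plain_attribute()
test_override_is_stored_directly()
```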
Key Files
- `autofit/non_linear/search/abstract_search.py` — base class config machinery removal
- `autofit/non_linear/search/nest/nautilus/search.py` — Nautilus explicit params
- `autofit/non_linear/search/nest/dynesty/search/{static,dynamic,abstract}.py` — Dynesty explicit params
- `autofit/non_linear/search/mcmc/{emcee,zeus}/search.py` — MCMC explicit params
- `autofit/non_linear/search/mle/{bfgs,drawer}/search.py` — MLE explicit params
- `autofit/config/non_linear/*.yaml` — delete
Original Prompt
All searches use config files to load their input parameters and interface with their respective source code library.
Part of the design was so that a project like PyAutoLens could create a search, have the parameters set in a way that was suitable for that model and hide from the user everything else:
```python
search = af.Nautilus(
    path_prefix=Path("imaging"),
    name="start_here",
    unique_tag=dataset_name,
    n_live=100,
    n_batch=50,
    iterations_per_quick_update=1000,
)
```
Example of a modeling script:
@autolens_workspace/scripts/imaging/modeling.py
However, the downside of this approach is that the autofit source code became very confusing with search config loads; the config files which control this for a project are bloated, take effort to set up and maintain, and inevitably lead to bugs.
This issue shows part of the problem:
#1001
Is there a solution which makes the source code interface a lot cleaner, doesn't have these config files, and makes it easier for new samplers to be set up? I wonder if having a Python class input to each search class is the solution; in the past this would have been a pain to maintain for each specific sampler, but now that AI agents can do this quickly it isn't so bad.
The biggest downside is that a project like autolens_workspace may need to manually specify search inputs everywhere the search is set up, although maybe there's a no-required-config trick or inheritance trick we could use?
Think hard and come up with an assessment of how we can make the source code a lot cleaner without losing the desired API.