interface (Interface): Interface with queues and events to be passed to the controller
Keyword Args:
- controller_type (Optional [str]): Defines the type of controller; can be 'random', 'nelder' or 'gaussian_process'. Defaults to 'gaussian_process'.
+ controller_type (Optional [str]): Defines the type of controller; can be 'random', 'nelder', 'gaussian_process' or 'neural_net'. Defaults to 'gaussian_process'.
**controller_config_dict : Options to be passed to controller.
Returns:
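The `controller_type` dispatch described above can be sketched as a simple lookup. This is an illustrative sketch only: the factory body and the controller class names here are placeholders, not the library's actual implementation.

```python
# Placeholder controller classes standing in for the real ones.
class RandomController: pass
class NelderMeadController: pass
class GaussianProcessController: pass
class NeuralNetController: pass

# Map each accepted controller_type string to a controller class.
_CONTROLLERS = {
    'random': RandomController,
    'nelder': NelderMeadController,
    'gaussian_process': GaussianProcessController,
    'neural_net': NeuralNetController,
}

def create_controller(interface, controller_type='gaussian_process',
                      **controller_config_dict):
    """Return a controller of the requested type ('gaussian_process' by default)."""
    try:
        controller_class = _CONTROLLERS[controller_type]
    except KeyError:
        raise ValueError('Unknown controller_type: ' + repr(controller_type))
    # In the real library the interface and config options would be
    # forwarded to the controller constructor; this sketch ignores them.
    return controller_class()

controller = create_controller(interface=None, controller_type='neural_net')
print(type(controller).__name__)  # NeuralNetController
```

An unrecognized `controller_type` raises `ValueError` rather than silently falling back, mirroring the explicit-options style of the docstring.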
@@ -529,12 +529,12 @@ class MachineLearnerController(Controller):
Args:
interface (Interface): The interface to the experiment under optimization.
- **kwargs (Optional [dict]): Dictionary of options to be passed to Controller parent class, initial training learner and Gaussian Process learner.
-
+ **kwargs (Optional [dict]): Dictionary of options to be passed to Controller parent class and initial training learner.
+
Keyword Args:
- initial_training_source (Optional [string]): The type for the initial training source can be 'random' for the random learner or 'nelder_mead' for the Nelder-Mead learner. This learner is also called if the Gaussian process learner is too slow and a new point is needed. Default 'random'.
- num_training_runs (Optional [int]): The number of training runs to before starting the learner. If None, will by ten or double the number of parameters, whatever is larger.
- no_delay (Optional [bool]): If True, there is never any delay between a returned cost and the next parameters to run for the experiment. In practice, this means if the gaussian process has not prepared the next parameters in time the learner defined by the initial training source is used instead. If false, the controller will wait for the gaussian process to predict the next parameters and there may be a delay between runs.
+ training_type (Optional [string]): The type for the initial training source can be 'random' for the random learner, 'nelder_mead' for the Nelder-Mead learner or 'differential_evolution' for the Differential Evolution learner. This learner is also called if the machine learning learner is too slow and a new point is needed. Default 'differential_evolution'.
+ num_training_runs (Optional [int]): The number of training runs to perform before starting the learner. If None, will be ten or double the number of parameters, whichever is larger.
+ no_delay (Optional [bool]): If True, there is never any delay between a returned cost and the next parameters to run for the experiment. In practice, this means if the machine learning learner has not prepared the next parameters in time the learner defined by the initial training source is used instead. If false, the controller will wait for the machine learning learner to predict the next parameters and there may be a delay between runs.
'''
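The `no_delay` behaviour described above amounts to a timed fallback between two parameter sources. The following is a minimal sketch of that logic; the function name, signature, and queue-based hand-off are invented for illustration and are not the controller's real implementation.

```python
import queue

def next_parameters(ml_params_queue, training_learner, no_delay, timeout=1.0):
    # Illustrative sketch of the no_delay fallback; names are hypothetical.
    if no_delay:
        try:
            # Use the ML learner's parameters if they are already queued.
            return ml_params_queue.get_nowait()
        except queue.Empty:
            # Otherwise fall back to the faster training learner so the
            # experiment never waits between runs.
            return training_learner()
    # no_delay is False: block until the ML learner delivers.
    return ml_params_queue.get(timeout=timeout)

ml_queue = queue.Queue()
params = next_parameters(ml_queue, lambda: [0.5, 0.5], no_delay=True)
print(params)  # [0.5, 0.5] -- queue empty, so the training learner answered
```

With `no_delay=False` the same call would block on the queue instead, which is the "there may be a delay between runs" case in the docstring.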
def __init__(self, interface,
@@ -599,8 +599,8 @@ def __init__(self, interface,
**self.remaining_kwargs)
else:
- self.log.error('Unknown training type provided to Gaussian process controller:'+repr(training_type))
-
+ self.log.error('Unknown training type provided to machine learning controller:'+repr(training_type))
- Call _get_cost_and_in_dict() of parent Controller class. But also sends cost to Gaussian process learner and saves the cost if the parameters came from a trainer.
-
+ Call _get_cost_and_in_dict() of parent Controller class. But also sends cost to machine learning learner and saves the cost if the parameters came from a trainer.
Runs parent method and also starts training_learner.
'''
super(MachineLearnerController,self)._start_up()
- self.log.debug('GP learner started.')
+ self.log.debug('ML learner started.')
self.ml_learner.start()
def _optimization_routine(self):
'''
- Overrides _optimization_routine. Uses the parent routine for the training runs. Implements a customized _optimization_rountine when running the Gaussian Process learner.
+ Overrides _optimization_routine. Uses the parent routine for the training runs. Implements a customized _optimization_routine when running the machine learning learner.
'''
#Run the training runs using the standard optimization routine.