Merge pull request #58 from rodrigo-arenas/0.6.X
Release notes
rodrigo-arenas committed Jul 1, 2021
2 parents 8df98b3 + c99bc55 commit 5b2ab2f
Showing 4 changed files with 12 additions and 9 deletions.
2 changes: 1 addition & 1 deletion dev-requirements.txt
@@ -6,7 +6,7 @@ twine==3.3.0
numpy>=1.13.3
seaborn>=0.11.1
deap>=1.3.1
mlflow==1.17.0
mlflow>=1.17.0
black==21.5b2
sphinx
sphinx_gallery
10 changes: 6 additions & 4 deletions docs/release_notes.rst
@@ -22,12 +22,12 @@ Features:
* Added new parallel coordinates plot in :func:`~sklearn_genetic.plots.plot_parallel_coordinates`.
* Now, if one or more callbacks decide to stop the algorithm, their class names are
  printed so you can see which callbacks were responsible for the stop.
* Added support for extra methods coming from scikit-learn's BaseSearchCV, it is
still partial support, missing properties like `cv_results_`, `best_index_` and `multimetric_`.
* Added support for extra methods coming from scikit-learn's BaseSearchCV, like `cv_results_`,
`best_index_` and `refit_time_` among others.
* Added methods `on_start` and `on_end` to :class:`~sklearn_genetic.callbacks.base.BaseCallback`.
Now the algorithms check for the callbacks like this:

- **on_start**: When the evolutionary algorithm is called from the GASearchCV.fit method
- **on_start**: When the evolutionary algorithm is called from the GASearchCV.fit method.

- **on_step:** When the evolutionary algorithm finishes a generation (no change here).

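A minimal sketch (not part of this commit) of a custom callback wired to the hooks listed above; only the `on_end` and `on_step` signatures appear in this diff, so the `on_start` signature and the stop-by-returning-True convention are assumptions for illustration:

from sklearn_genetic.callbacks.base import BaseCallback


class ProgressLogger(BaseCallback):
    """Toy callback that reports each stage of the evolution."""

    def on_start(self, estimator=None):
        # Assumed signature: invoked once when GASearchCV.fit starts the algorithm.
        print("evolution started")
        return False

    def on_step(self, record=None, logbook=None, estimator=None):
        # Invoked after every generation, as in previous releases.
        print("generation finished:", record)
        return False  # returning True would ask the algorithm to stop

    def on_end(self, logbook=None, estimator=None):
        # Invoked once after the evolution loop ends.
        print("evolution finished")
        return False

Such a callback would then be passed through the `callbacks` argument of `GASearchCV.fit`, matching the `fit(self, X, y, callbacks=None)` signature shown in the genetic_search.py hunk below.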
@@ -48,7 +48,9 @@ API Changes:
now requires an explicit installation of seaborn and mlflow; those are now
optionally installed using ``pip install sklearn-genetic-opt[all]``.
* The GASearchCV.logbook property now has extra information that comes from the
scikit-learn cross_validate function
scikit-learn cross_validate function.
* An optional extra parameter was added to GASearchCV, named `return_train_score`: bool, default=``False``.
  As in scikit-learn, it controls whether `cv_results_` should include the training scores.
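A hedged usage sketch of the new parameter; the estimator, the `param_grid` space definition, and the `plot_parallel_coordinates` call shown here are assumptions for illustration, not taken from this diff:

from sklearn.datasets import load_digits
from sklearn.tree import DecisionTreeClassifier

from sklearn_genetic import GASearchCV
from sklearn_genetic.plots import plot_parallel_coordinates
from sklearn_genetic.space import Integer

X, y = load_digits(return_X_y=True)

evolved = GASearchCV(
    estimator=DecisionTreeClassifier(),
    param_grid={"max_depth": Integer(2, 20)},  # assumed space definition
    cv=3,
    scoring="accuracy",
    return_train_score=True,  # new optional parameter, default False
)
evolved.fit(X, y)

# With return_train_score=True, cv_results_ should also carry the train scores.
print(evolved.cv_results_.keys())
print(evolved.best_params_, evolved.refit_time_)

# New parallel coordinates plot of the sampled hyperparameters (assumed call signature).
plot_parallel_coordinates(evolved)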

^^^^^
Docs:
3 changes: 2 additions & 1 deletion sklearn_genetic/callbacks/base.py
@@ -54,6 +54,7 @@ def on_end(self, logbook=None, estimator=None):
"""
pass # pragma: no cover

def __call__(self, record=None, logbook=None, estimator=None):
return self.on_step(record, logbook, estimator)

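As the hunk above shows, invoking a callback instance simply forwards to `on_step`. A tiny sketch of that behavior (the `CountGenerations` class is hypothetical):

from sklearn_genetic.callbacks.base import BaseCallback


class CountGenerations(BaseCallback):
    """Hypothetical callback used only to show the __call__ delegation."""

    def __init__(self):
        self.generations = 0

    def on_step(self, record=None, logbook=None, estimator=None):
        self.generations += 1
        return False  # never asks the algorithm to stop


counter = CountGenerations()
counter(record={"fitness": 0.9})  # equivalent to counter.on_step(record={"fitness": 0.9})
print(counter.generations)        # 1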
6 changes: 3 additions & 3 deletions sklearn_genetic/genetic_search.py
@@ -441,6 +441,7 @@ def fit(self, X, y, callbacks=None):

# Make sure the callbacks are valid
self.callbacks = check_callback(callbacks)

self.scorer_ = check_scoring(self.estimator, scoring=self.scoring)

# Check cv and get the n_splits
@@ -481,16 +482,15 @@ self.estimator.set_params(**self.best_params_)
self.estimator.set_params(**self.best_params_)

refit_start_time = time.time()

self.estimator.fit(self.X_, self.y_)
refit_end_time = time.time()
self.refit_time_ = refit_end_time - refit_start_time

self.best_estimator_ = self.estimator

# hof keeps the best params according to the fitness value
# To be consistent with self.best_estimator_, if more than 1 model gets same score
# It could lead to differences between hof and self.best_estimator_
# To be consistent with self.best_estimator_, if more than 1 model gets the
# same score, it could lead to differences between hof and self.best_estimator_
self._hof.remove(0)
self._hof.items.insert(0, list(self.best_params_.values()))
self._hof.keys.insert(0, self.best_score_)
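A standalone sketch of the refit-timing pattern shown above, applied to a plain scikit-learn estimator to illustrate what `refit_time_` measures (only the final refit on the best parameters, not the whole search):

import time

from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)
estimator = LogisticRegression(max_iter=500)

refit_start_time = time.time()
estimator.fit(X, y)  # the single refit of the winning configuration
refit_time_ = time.time() - refit_start_time

print(f"refit took {refit_time_:.4f} seconds")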
