
Commit

pep8 fix and bumping up the changelog note to the 0.17 release
rasbt committed May 17, 2019
1 parent 5433db0 commit 8108b0e
Showing 2 changed files with 10 additions and 8 deletions.
3 changes: 1 addition & 2 deletions docs/sources/CHANGELOG.md
@@ -16,7 +16,7 @@ The CHANGELOG for the current development version is available at

##### New Features

- -
- Add optional `groups` parameter to `SequentialFeatureSelector` and `ExhaustiveFeatureSelector` `fit()` methods for forwarding to sklearn CV ([#537](https://github.com/rasbt/mlxtend/pull/537) via [arc12](https://github.com/arc12))
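The new `groups` argument is simply forwarded to scikit-learn's cross-validation machinery. A minimal sketch of the underlying mechanism, using plain scikit-learn rather than mlxtend (the toy data and group labels here are invented for illustration):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GroupKFold, cross_val_score

# Toy data: 12 samples from 4 groups (e.g., 4 different subjects).
rng = np.random.RandomState(0)
X = rng.randn(12, 3)
y = np.array([0, 1] * 6)
groups = np.array([0, 0, 0, 1, 1, 1, 2, 2, 2, 3, 3, 3])

# GroupKFold keeps every sample of a group in the same fold, so no
# group leaks from a training split into its test split.
cv = GroupKFold(n_splits=4)
scores = cross_val_score(LogisticRegression(), X, y, cv=cv, groups=groups)
print(len(scores))  # one score per fold
```

Presumably `SequentialFeatureSelector(...).fit(X, y, groups=groups)` with a group-aware `cv` forwards `groups` to the splitter in the same way.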

##### Changes

@@ -44,7 +44,6 @@ The CHANGELOG for the current development version is available at
- Now, the `StackingCVRegressor` also enables grid search over the `regressors` and even a single base regressor. When there are level-mixed parameters, `GridSearchCV` will try to replace hyperparameters in a top-down order (see the [documentation](http://rasbt.github.io/mlxtend/user_guide/regressor/StackingCVRegressor/) for examples and details). ([#515](https://github.com/rasbt/mlxtend/pull/512) via [Qiang Gu](https://github.com/qiagu))
- Adds a `verbose` parameter to `apriori` to show the current iteration number as well as the itemset size currently being sampled. ([#519](https://github.com/rasbt/mlxtend/pull/519))
- Adds an optional `class_name` parameter to the confusion matrix function to display class names on the axis as tick marks. ([#487](https://github.com/rasbt/mlxtend/pull/487) via [sandpiturtle](https://github.com/sandpiturtle))
- Add optional `groups` parameter to `SequentialFeatureSelector` and `ExhaustiveFeatureSelector` `fit()` methods for forwarding to sklearn CV
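The "level-mixed parameters replaced in a top-down order" behavior described above comes from scikit-learn's `set_params`/`GridSearchCV` machinery. A minimal sketch with a plain `Pipeline` rather than `StackingCVRegressor` (the same mechanism applies; the toy regression data is invented): when a grid mixes `model` and `model__alpha`, the estimator object is swapped first, then the nested hyperparameter is set on whichever estimator is active.

```python
import numpy as np
from sklearn.linear_model import Lasso, Ridge
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline

rng = np.random.RandomState(0)
X = rng.randn(40, 3)
y = X @ np.array([1.0, 2.0, 0.5]) + 0.1 * rng.randn(40)

# One tunable step; the grid both swaps the estimator object ('model')
# and tunes a hyperparameter of the active estimator ('model__alpha').
# set_params applies 'model' before 'model__alpha', i.e. top-down.
pipe = Pipeline([("model", Ridge())])
param_grid = {"model": [Ridge(), Lasso()], "model__alpha": [0.1, 1.0]}
search = GridSearchCV(pipe, param_grid, cv=3)
search.fit(X, y)
```

After fitting, `search.best_params_` contains both the winning estimator object and the winning `alpha`.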

##### Changes

15 changes: 9 additions & 6 deletions mlxtend/feature_selection/sequential_feature_selector.py
@@ -295,8 +295,8 @@ def fit(self, X, y, custom_feature_names=None, groups=None, **fit_params):
if not isinstance(self.k_features, int) and\
not isinstance(self.k_features, tuple)\
and not isinstance(self.k_features, str):
-            raise AttributeError('k_features must be a positive integer'
-                             ', tuple, or string')
+            raise AttributeError('k_features must be a positive integer'
+                                 ', tuple, or string')

if (isinstance(self.k_features, int) and (
self.k_features < 1 or self.k_features > X_.shape[1])):
@@ -355,7 +355,8 @@ def fit(self, X, y, custom_feature_names=None, groups=None, **fit_params):
k_to_select = min_k
k_idx = tuple(range(X_.shape[1]))
k = len(k_idx)
-        k_idx, k_score = _calc_score(self, X_, y, k_idx, groups=groups, **fit_params)
+        k_idx, k_score = _calc_score(self, X_, y, k_idx,
+                                     groups=groups, **fit_params)
self.subsets_[k] = {
'feature_idx': k_idx,
'cv_scores': k_score,
@@ -480,7 +481,7 @@ def fit(self, X, y, custom_feature_names=None, groups=None, **fit_params):
X)
raise KeyboardInterrupt

-        except KeyboardInterrupt as e:
+        except KeyboardInterrupt:
self.interrupted_ = True
sys.stderr.write('\nSTOPPING EARLY DUE TO KEYBOARD INTERRUPT...')

@@ -549,7 +550,8 @@ def _inclusion(self, orig_set, subset, X, y, ignore_feature=None,
all_cv_scores[best])
return res

-    def _exclusion(self, feature_set, X, y, fixed_feature=None, groups=None, **fit_params):
+    def _exclusion(self, feature_set, X, y, fixed_feature=None,
+                   groups=None, **fit_params):
n = len(feature_set)
res = (None, None, None)
if n > 1:
@@ -560,7 +562,8 @@ def _exclusion(self, feature_set, X, y, fixed_feature=None, groups=None, **fit_p
n_jobs = min(self.n_jobs, features)
parallel = Parallel(n_jobs=n_jobs, verbose=self.verbose,
pre_dispatch=self.pre_dispatch)
-            work = parallel(delayed(_calc_score)(self, X, y, p, groups=groups, **fit_params)
+            work = parallel(delayed(_calc_score)(self, X, y, p,
+                                                 groups=groups, **fit_params)
for p in combinations(feature_set, r=n - 1)
if not fixed_feature or fixed_feature in set(p))

