This repository has been archived by the owner on Dec 6, 2023. It is now read-only.

Commit

Merge pull request #22 from thuijskens/update-readme
Update readme with more examples
thuijskens committed Sep 15, 2018
2 parents b06ac1f + 18f5ac5 commit 7916344
Showing 6 changed files with 82 additions and 32 deletions.
76 changes: 62 additions & 14 deletions README.md
# stability-selection - A scikit-learn compatible implementation of stability selection

[![Build Status](https://travis-ci.org/scikit-learn-contrib/stability-selection.svg?branch=master)](https://travis-ci.org/scikit-learn-contrib/stability-selection)
[![Coverage Status](https://coveralls.io/repos/github/scikit-learn-contrib/stability-selection/badge.svg?branch=master)](https://coveralls.io/github/scikit-learn-contrib/stability-selection?branch=master)
[![CircleCI](https://circleci.com/gh/scikit-learn-contrib/stability-selection.svg?style=svg)](https://circleci.com/gh/scikit-learn-contrib/stability-selection)

**stability-selection** is a Python implementation of the stability selection feature selection algorithm, first proposed by [Meinshausen and Buhlmann](https://stat.ethz.ch/~nicolai/stability.pdf).

The idea behind stability selection is to inject more noise into the original problem by generating bootstrap samples of the data, and to use a base feature selection algorithm (like the LASSO) to find out which features are important in every sampled version of the data. The results on each bootstrap sample are then aggregated to compute a *stability score* for each feature in the data. Features can then be selected by choosing an appropriate threshold for the stability scores.
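The aggregation step can be illustrated with a minimal, self-contained NumPy sketch (this is an illustration of the idea, not the package's internals, and the selection masks below are synthetic): each bootstrap run yields a boolean mask of selected features, a feature's stability score is the fraction of runs that selected it, and features whose score clears a threshold are kept.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical selection masks from 100 bootstrap runs over 10 features:
# feature 0 is picked in roughly 90% of runs, the others only sporadically.
n_runs, n_features = 100, 10
masks = rng.random((n_runs, n_features)) < 0.2
masks[:, 0] = rng.random(n_runs) < 0.9

# Stability score of a feature = fraction of runs in which it was selected.
stability_scores = masks.mean(axis=0)

# Keep features whose score clears a chosen threshold.
selected = np.flatnonzero(stability_scores >= 0.6)
print(selected)
```

With these synthetic masks, only the consistently selected feature survives the threshold, while features that are picked only occasionally are filtered out.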

## Installation

To install the module, clone the repository
```bash
git clone https://github.com/scikit-learn-contrib/stability-selection.git
```
Before installing the module you will need `numpy`, `matplotlib`, and `sklearn`. Install these modules separately, or install using the `requirements.txt` file:
```bash
pip install -r requirements.txt
```
and execute the following in the project directory to install `stability-selection`:
```bash
python setup.py install
```

## Documentation and algorithmic details

See the [documentation](https://thuijskens.github.io/stability-selection/docs/index.html) for details on the module, and the accompanying [blog post](https://thuijskens.github.io/2018/07/25/stability-selection/) for a discussion of the algorithm.

## Example usage

`stability-selection` implements a class `StabilitySelection`, which takes any scikit-learn compatible estimator that has either a ``feature_importances_`` or ``coef_`` attribute after fitting. Other important parameters are

- `lambda_name`: the name of the penalization parameter of the base estimator (for example, `C` in the case of `LogisticRegression`).
- `lambda_grid`: an array of values of the penalization parameter to iterate over.

After instantiation, the algorithm can be run with the familiar `fit` and `transform` calls.

### Basic example
```python
import numpy as np

from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

from stability_selection import StabilitySelection


def _generate_dummy_classification_data(p=1000, n=1000, k=5, random_state=123321):
    # (function body elided in the collapsed diff: builds a random binary
    # classification data set with n samples and p features, of which k
    # are informative)
    ...

    return X, y, important_betas


## This is all preparation of the dummy data set
n, p, k = 500, 1000, 5

X, y, important_betas = _generate_dummy_classification_data(n=n, k=k)
base_estimator = Pipeline([
    ('scaler', StandardScaler()),
    ('model', LogisticRegression(penalty='l1'))
])

## Here stability selection is instantiated and run
selector = StabilitySelection(base_estimator=base_estimator, lambda_name='model__C',
                              lambda_grid=np.logspace(-5, -1, 50)).fit(X, y)

print(selector.get_support(indices=True))
```

### Bootstrapping strategies

`stability-selection` uses bootstrapping without replacement by default (as proposed in the original paper), but it supports other bootstrapping strategies as well. Shah and Samworth [2] proposed *complementary pairs* bootstrapping, where the data set is sampled in pairs such that their intersection is empty but their union equals the original data set. `StabilitySelection` supports this through the `bootstrap_func` parameter.

This parameter can be:
- A string, which must be one of
  - 'subsample': For subsampling without replacement (default).
  - 'complementary_pairs': For complementary pairs subsampling [2].
  - 'stratified': For stratified bootstrapping in imbalanced classification.
- A function that takes `y` and a random state as inputs and returns a list of sample indices in the range `(0, len(y)-1)`.

For example, the `StabilitySelection` call in the above example can be replaced with
```python
selector = StabilitySelection(base_estimator=base_estimator,
                              lambda_name='model__C',
                              lambda_grid=np.logspace(-5, -1, 50),
                              bootstrap_func='complementary_pairs')
selector.fit(X, y)
```
to run stability selection with complementary pairs bootstrapping.
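The callable form of `bootstrap_func` follows the same pattern. Below is a hedged sketch: `half_subsample` is a hypothetical name, the body assumes `random_state` arrives as an integer seed, and the exact signature expected by the installed version of the package should be checked before use.

```python
import numpy as np

def half_subsample(y, random_state):
    """Hypothetical custom strategy: draw half the samples, without
    replacement, as required by the bootstrap_func contract described
    above (indices in the range (0, len(y)-1))."""
    rng = np.random.RandomState(random_state)  # assumes an int seed
    n_samples = len(y)
    return rng.choice(n_samples, size=n_samples // 2, replace=False)

# It would then be passed in place of the string options, e.g.
# StabilitySelection(..., bootstrap_func=half_subsample)
```

This is how strategies not shipped with the package (for example, group-aware or time-aware subsampling) could be plugged in.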

## Feedback and contributing

Feedback and contributions are much appreciated. If you have any feedback, please post it on the [issue tracker](https://github.com/scikit-learn-contrib/stability-selection/issues).

## References

[1] Meinshausen, N. and Buhlmann, P., 2010. Stability selection. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 72(4), pp.417-473.

[2] Shah, R.D. and Samworth, R.J., 2013. Variable selection with error control: another look at stability selection. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 75(1), pp.55-80.
2 changes: 1 addition & 1 deletion ci_scripts/circleci/push_doc.sh
GENERATED_DOC_DIR=$(readlink -f $GENERATED_DOC_DIR)

if [ "$CIRCLE_BRANCH" = "master" ]
then
dir=docs # NOTE: I needed to change this from dev to docs for gh-pages to work
else
# Strip off .X
dir="${CIRCLE_BRANCH::-2}"
12 changes: 2 additions & 10 deletions doc/conf.py
'numpydoc',
'sphinx.ext.ifconfig',
'sphinx.ext.viewcode',
'sphinx_gallery.gen_gallery',
'sphinx.ext.mathjax'
]

numpydoc_show_class_members = False


sphinx_gallery_conf = {
# path to your examples scripts
'examples_dirs' : '../examples',
2 changes: 2 additions & 0 deletions doc/index.rst
This algorithm identifies a set of “stable” variables that are selected with
:maxdepth: 2

api
stability_selection
randomized_lasso
auto_examples/index
...

4 changes: 2 additions & 2 deletions stability_selection/randomized_lasso.py
===========================
This module contains implementations of randomized logistic regression
and randomized LASSO regression [1]_ .
References
----------
.. [1] Meinshausen, N. and Buhlmann, P., 2010. Stability selection.
Journal of the Royal Statistical Society: Series B
(Statistical Methodology), 72(4), pp.417-473.
"""
18 changes: 13 additions & 5 deletions stability_selection/stability_selection.py
===============================
This module contains a scikit-learn compatible implementation of
stability selection [1]_ .
References
----------
.. [1] Meinshausen, N. and Buhlmann, P., 2010. Stability selection.
Journal of the Royal Statistical Society: Series B
(Statistical Methodology), 72(4), pp.417-473.
.. [2] Shah, R.D. and Samworth, R.J., 2013. Variable selection with
error control: another look at stability selection. Journal
of the Royal Statistical Society: Series B (Statistical Methodology),
75(1), pp.55-80.
"""

from warnings import warn
def plot_stability_path(stability_selection, threshold_highlight=None,


class StabilitySelection(BaseEstimator, TransformerMixin):
"""Stability selection [1] fits the estimator `base_estimator` on
"""Stability selection [1]_ fits the estimator `base_estimator` on
bootstrap samples of the original data set, for different values of
the regularization parameter for `base_estimator`. Variables that
reliably get selected by the model in these bootstrap samples are
Parameters
----------
base_estimator : object.
The base estimator used for stability selection. The estimator
must have either a ``feature_importances_`` or ``coef_``
attribute after fitting.
lambda_name : str.
The name of the penalization parameter for the estimator
The function used to subsample the data. This parameter can be:
- A string, which must be one of
- 'subsample': For subsampling without replacement.
- 'complementary_pairs': For complementary pairs subsampling [2]_ .
- 'stratified': For stratified bootstrapping in imbalanced
classification.
- A function that takes y, and a random state
as inputs and returns a list of sample indices in the range
(0, len(y)-1). By default, indices are uniformly subsampled.
