Add property test for optimizers complexity #55

Merged: 1 commit, Mar 8, 2020
9 changes: 0 additions & 9 deletions pysindy/optimizers/sr3.py
@@ -64,15 +64,6 @@ class SR3(BaseOptimizer):
    copy_X : boolean, optional (default True)
        If True, X will be copied; else, it may be overwritten.

-    unbias : boolean, optional (default True)
-        Whether to perform an extra step of unregularized linear regression to unbias
-        the coefficients for the identified support.
-        For example, if `STLSQ(alpha=0.1)` is used then the learned coefficients will
-        be biased toward 0 due to the L2 regularization.
-        Setting `unbias=True` will trigger an additional step wherein the nonzero
-        coefficients learned by the `STLSQ` object will be updated using an
-        unregularized least-squares fit.
-
    Attributes
    ----------
    coef_ : array, shape (n_features,) or (n_targets, n_features)
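The behaviour described in the removed text is exposed through the SINDyOptimizer wrapper used in the new test below (SINDyOptimizer(o, unbias=True)). A minimal sketch of such an unbiasing step, assuming a hypothetical helper built on scikit-learn's LinearRegression rather than pysindy's actual internals:

import numpy as np
from sklearn.linear_model import LinearRegression

def unbias_coefficients(x, y, coef):
    # Hypothetical helper, not pysindy's implementation: refit only the
    # identified support with an unregularized least-squares fit, undoing
    # the shrinkage introduced by the regularizer.
    support = np.abs(coef) > 0
    unbiased = np.zeros_like(coef)
    if support.any():
        lr = LinearRegression(fit_intercept=False).fit(x[:, support], y)
        unbiased[support] = lr.coef_
    return unbiased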
9 changes: 0 additions & 9 deletions pysindy/optimizers/stlsq.py
@@ -45,15 +45,6 @@ class STLSQ(BaseOptimizer):
    copy_X : boolean, optional (default True)
        If True, X will be copied; else, it may be overwritten.

-    unbias : boolean, optional (default True)
-        Whether to perform an extra step of unregularized linear regression to unbias
-        the coefficients for the identified support.
-        For example, if `STLSQ(alpha=0.1)` is used then the learned coefficients will
-        be biased toward 0 due to the L2 regularization.
-        Setting `unbias=True` will trigger an additional step wherein the nonzero
-        coefficients learned by the `STLSQ` object will be updated using an
-        unregularized least-squares fit.
-
    Attributes
    ----------
    coef_ : array, shape (n_features,) or (n_targets, n_features)
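The `STLSQ(alpha=0.1)` example in the removed text refers to L2 shrinkage. A small illustration of that bias, using scikit-learn's Ridge as a stand-in for the ridge regression inside STLSQ (the data and true coefficients are invented for demonstration):

import numpy as np
from sklearn.linear_model import LinearRegression, Ridge

rng = np.random.default_rng(0)
x = rng.normal(size=(200, 3))
y = x @ np.array([1.0, -2.0, 0.5]) + 0.01 * rng.normal(size=200)

# Ridge coefficients are typically shrunk toward 0 relative to the plain
# least-squares fit; the unbias step refits without regularization to undo this.
print(Ridge(alpha=0.1, fit_intercept=False).fit(x, y).coef_)
print(LinearRegression(fit_intercept=False).fit(x, y).coef_)  # near [1.0, -2.0, 0.5]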
1 change: 1 addition & 0 deletions requirements-dev.txt
@@ -14,3 +14,4 @@ sphinx >= 2
sphinxcontrib-apidoc
sphinx_rtd_theme
pre-commit
+hypothesis
52 changes: 52 additions & 0 deletions test/property_tests/test_optimizers_complexity.py
@@ -0,0 +1,52 @@
from hypothesis import assume
from hypothesis import given
from hypothesis import settings
from hypothesis.strategies import integers
from sklearn.datasets import make_regression
from sklearn.linear_model import ElasticNet
from sklearn.linear_model import Lasso
from sklearn.linear_model import LinearRegression
from sklearn.linear_model import Ridge

from pysindy.optimizers import SINDyOptimizer
from pysindy.optimizers import SR3
from pysindy.optimizers import STLSQ


@given(
    n_samples=integers(min_value=100, max_value=10000),
    n_features=integers(min_value=3, max_value=30),
    n_informative=integers(min_value=1, max_value=10),
    random_state=integers(min_value=0, max_value=2 ** 32 - 1),
)
@settings(max_examples=10)
def test_complexity(n_samples, n_features, n_informative, random_state):
    """Behaviour test for complexity.

    We assume that more regularized optimizers are less complex on the same dataset.
    """
    assume(n_informative <= n_features)

    x, y = make_regression(
        n_samples, n_features, n_informative, 1, 0, noise=0.1, random_state=random_state
    )
    y = y.reshape(-1, 1)

    opt_kwargs = dict(fit_intercept=True, normalize=False)
    optimizers = [
        SR3(thresholder="l0", **opt_kwargs),
        SR3(thresholder="l1", **opt_kwargs),
        Lasso(**opt_kwargs),
        STLSQ(**opt_kwargs),
        ElasticNet(**opt_kwargs),
        Ridge(**opt_kwargs),
        LinearRegression(**opt_kwargs),
    ]

    optimizers = [SINDyOptimizer(o, unbias=True) for o in optimizers]

    for opt in optimizers:
        opt.fit(x, y)

    for less_complex, more_complex in zip(optimizers, optimizers[1:]):
        assert less_complex.complexity <= more_complex.complexity
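The final loop asserts that model complexity is non-decreasing along the optimizer list, which is ordered from most to least regularized. Whether SINDyOptimizer computes complexity exactly this way is an assumption, but a measure consistent with the test is a count of nonzero terms:

import numpy as np

def complexity(coef, intercept=0.0):
    # A plausible proxy for model complexity: the number of nonzero
    # coefficients, plus the intercept if it is nonzero.
    return np.count_nonzero(coef) + np.count_nonzero(intercept)

# A sparser coefficient vector yields a lower complexity.
assert complexity(np.array([0.0, 1.2, 0.0])) < complexity(np.array([0.3, 1.2, -0.7]))

Running something like pytest test/property_tests/test_optimizers_complexity.py exercises the property: Hypothesis draws up to 10 parameter sets (@settings(max_examples=10)) and discards any draw where assume(n_informative <= n_features) fails.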