
Clean up fixture names and usages in tests #895

Merged · 21 commits merged into master from the 891_rename branch on Jul 1, 2020
Conversation

angela97lin (Contributor)

Closes #891 by removing the X_y fixture in conftest.py / test_auto_regression.py, renaming X_y to X_y_binary, renaming X_y_reg to X_y_regression, and cleaning up some instances where we were using X_y for regression problems.

I think it's a good idea to rename X_y to X_y_binary because there were a few instances where we were using X_y for regression problems, and hopefully having a clearer name will help avoid that in the future!
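As a concrete reference, here is a minimal sketch of what the renamed fixtures could look like in conftest.py. The X_y_binary body mirrors the make_classification call visible in the conftest diff further down in this thread; the X_y_regression body and its make_regression parameters are illustrative assumptions, not copied from the PR.

```python
import pytest
from sklearn import datasets


@pytest.fixture
def X_y_binary():
    # Binary classification data; parameters mirror the make_classification
    # call shown in the conftest.py diff later in this conversation.
    X, y = datasets.make_classification(n_samples=100, n_features=20,
                                        n_informative=2, n_redundant=2,
                                        random_state=0)
    return X, y


@pytest.fixture
def X_y_regression():
    # Regression data; these make_regression parameters are assumptions
    # chosen for illustration, not taken from the PR diff.
    X, y = datasets.make_regression(n_samples=100, n_features=20,
                                    n_informative=3, random_state=0)
    return X, y
```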

angela97lin marked this pull request as draft June 29, 2020 17:59
codecov bot commented Jun 29, 2020

Codecov Report

Merging #895 into master will increase coverage by 0.02%.
The diff coverage is 100.00%.


```diff
@@            Coverage Diff             @@
##           master     #895      +/-   ##
==========================================
+ Coverage   99.79%   99.81%   +0.02%
==========================================
  Files         197      197
  Lines        9205     9168      -37
==========================================
- Hits         9186     9151      -35
+ Misses         19       17       -2
```
| Impacted Files | Coverage Δ |
| --- | --- |
| evalml/tests/component_tests/test_utils.py | 100.00% <ø> (+2.32%) ⬆️ |
| evalml/tests/automl_tests/test_automl.py | 100.00% <100.00%> (ø) |
| .../automl_tests/test_automl_search_classification.py | 100.00% <100.00%> (ø) |
| ...ests/automl_tests/test_automl_search_regression.py | 100.00% <100.00%> (ø) |
| ...l/tests/automl_tests/test_pipeline_search_plots.py | 100.00% <100.00%> (ø) |
| .../tests/component_tests/test_baseline_classifier.py | 100.00% <100.00%> (ø) |
| ...l/tests/component_tests/test_baseline_regressor.py | 100.00% <100.00%> (ø) |
| .../tests/component_tests/test_catboost_classifier.py | 100.00% <100.00%> (ø) |
| ...l/tests/component_tests/test_catboost_regressor.py | 100.00% <100.00%> (ø) |
| evalml/tests/component_tests/test_components.py | 100.00% <100.00%> (ø) |

... and 32 more

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 99ee2e5...f8fd30f

```diff
@@ -23,7 +22,6 @@ def make_mock_import_module(libs_to_exclude):
     def _import_module(library):
         if library in libs_to_exclude:
             raise ImportError("Cannot import {}; excluded by mock muahahaha".format(library))
-        return import_module(library)
```
angela97lin (Contributor, Author): Removed because this was never called and was causing codecov to fail!

Contributor: Ah gotcha. Cool! As discussed, we have coverage for this via the core dependencies CI job, so it's fine by me to remove this.
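For readers unfamiliar with the pattern being discussed, here is a minimal sketch of how a mock import function like this can be wired in with unittest.mock.patch to simulate a missing optional dependency. The helper mirrors the pre-PR code shown in the diff above; the test body, the patched target, and the xgboost example are illustrative assumptions, not taken from the evalml test suite.

```python
import importlib
from importlib import import_module
from unittest.mock import patch

import pytest


def make_mock_import_module(libs_to_exclude):
    # Mirrors the helper shown in the diff above (as it looked before this
    # PR's removal): imports behave normally unless the library is excluded,
    # in which case an ImportError simulates a missing dependency.
    def _import_module(library):
        if library in libs_to_exclude:
            raise ImportError("Cannot import {}; excluded by mock muahahaha".format(library))
        return import_module(library)
    return _import_module


def test_import_fails_for_excluded_library():
    # Hypothetical usage: patch importlib.import_module so that importing
    # "xgboost" through it appears to fail, then assert the error surfaces.
    with patch("importlib.import_module", make_mock_import_module({"xgboost"})):
        with pytest.raises(ImportError, match="excluded by mock"):
            importlib.import_module("xgboost")
```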

angela97lin marked this pull request as ready for review June 29, 2020 22:30
eccabay (Contributor) left a comment: I'm such a fan of this, it makes things so much cleaner!

Resolved review threads (outdated):
- evalml/tests/automl_tests/test_automl.py
- evalml/tests/automl_tests/test_pipeline_search_plots.py
- evalml/tests/component_tests/test_feature_selectors.py
dsherry (Contributor) commented Jun 30, 2020: @angela97lin please ping me whenever you get the conflicts resolved and I'll review/approve!

angela97lin added this to the July 2020 milestone Jul 1, 2020
dsherry (Contributor) left a comment: Good stuff! Left one note on the changelog, otherwise good to go.

Resolved review thread (outdated): docs/source/changelog.rst
```diff
     X, y = datasets.make_classification(n_samples=100, n_features=20,
                                         n_informative=2, n_redundant=2, random_state=0)
     return X, y


 @pytest.fixture
-def X_y_reg():
+def X_y_regression():
```
Contributor: Nice
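To show how the rename plays out in the tests themselves, here is an illustrative test (hypothetical, not taken from the PR) that picks up the renamed fixture purely by parameter name, which is why the cleanup only touches test signatures rather than any fixture wiring.

```python
def test_regression_data_shapes(X_y_regression):
    # pytest injects fixtures by matching parameter names, so updating a
    # test from X_y_reg to X_y_regression is just a signature change.
    X, y = X_y_regression
    assert len(X) == len(y)
```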


angela97lin merged commit 133c9bc into master Jul 1, 2020
angela97lin deleted the 891_rename branch July 1, 2020 21:38
dsherry mentioned this pull request Jul 16, 2020
Labels: None yet
Projects: None yet
Development: Successfully merging this pull request may close these issues:
- Rename X_y fixture in conftest.py / test_auto_regression.py

3 participants