
Fix pytest options to show skipped tests and how to unskip them rather than deselecting them #4966

Merged
3 commits merged into mlflow:master from fix-pytest-options on Nov 2, 2021

Conversation

@harupy harupy (Member) commented on Oct 30, 2021

Signed-off-by: harupy <hkawamura0130@gmail.com>

What changes are proposed in this pull request?

Fix the pytest options so that skipped tests are shown along with instructions on how to unskip them. This helps first-time contributors and new team members who are not yet familiar with the options for running tests:

(screenshot of the resulting pytest output)
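For context, deselecting hides tests from the report entirely, whereas skipping lists them as SKIPPED together with a reason. The sketch below only illustrates that difference; it is not the actual MLflow conftest, and the `--large` option and marker name are assumed here:

# Illustrative conftest.py sketch (not the MLflow one): contrast between
# deselecting and skipping tests marked "large".
import pytest

# Deselecting: the tests disappear from the report with no explanation.
def pytest_collection_modifyitems(config, items):
    if not config.getoption("--large"):
        deselected = [item for item in items if "large" in item.keywords]
        config.hook.pytest_deselected(items=deselected)
        items[:] = [item for item in items if "large" not in item.keywords]

# Skipping: the tests appear as SKIPPED with a reason telling the reader
# how to run them.
def pytest_runtest_setup(item):
    if "large" in item.keywords and not item.config.getoption("--large"):
        pytest.skip("use `--large` to run this test")

Only one of the two approaches would be active at a time; the point of this PR is that the second gives contributors an actionable message instead of silently dropping tests.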

How is this patch tested?

Existing tests

Release Notes

Is this a user-facing change?

  • No. You can skip the rest of this section.
  • Yes. Give a description of this change to be included in the release notes for MLflow users.

(Details in 1-2 sentences. You can just refer to another PR with a description if this PR is part of a larger change.)

What component(s), interfaces, languages, and integrations does this PR affect?

Components

  • area/artifacts: Artifact stores and artifact logging
  • area/build: Build and test infrastructure for MLflow
  • area/docs: MLflow documentation pages
  • area/examples: Example code
  • area/model-registry: Model Registry service, APIs, and the fluent client calls for Model Registry
  • area/models: MLmodel format, model serialization/deserialization, flavors
  • area/projects: MLproject format, project running backends
  • area/scoring: MLflow Model server, model deployment tools, Spark UDFs
  • area/server-infra: MLflow Tracking server backend
  • area/tracking: Tracking Service, tracking client APIs, autologging

Interface

  • area/uiux: Front-end, user experience, plotting, JavaScript, JavaScript dev server
  • area/docker: Docker use across MLflow's components, such as MLflow Projects and MLflow Models
  • area/sqlalchemy: Use of SQLAlchemy in the Tracking Service or Model Registry
  • area/windows: Windows support

Language

  • language/r: R APIs and clients
  • language/java: Java APIs and clients
  • language/new: Proposals for new client languages

Integrations

  • integrations/azure: Azure and Azure ML integrations
  • integrations/sagemaker: SageMaker integrations
  • integrations/databricks: Databricks integrations

How should the PR be classified in the release notes? Choose one:

  • rn/breaking-change - The PR will be mentioned in the "Breaking Changes" section
  • rn/none - No description will be included. The PR will be mentioned only by the PR number in the "Small Bugfixes and Documentation Updates" section
  • rn/feature - A new user-facing feature worth mentioning in the release notes
  • rn/bug-fix - A user-facing bug fix worth mentioning in the release notes
  • rn/documentation - A user-facing documentation change worth mentioning in the release notes

@github-actions github-actions bot added the rn/none label (List under Small Changes in Changelogs) on Oct 30, 2021
Comment on lines +50 to +55
# Register markers to suppress `PytestUnknownMarkWarning`
config.addinivalue_line("markers", "large")
config.addinivalue_line("markers", "requires_ssh")
config.addinivalue_line("markers", "lazy_import")
config.addinivalue_line("markers", "notrackingurimock")
config.addinivalue_line("markers", "allow_infer_pip_requirements_fallback")
@harupy harupy (Member, Author) commented:

pytest raises the following warning for unregistered markers:

/home/runner/work/mlflow/mlflow/tests/pyfunc/test_model_export_with_class_and_artifacts.py:650: PytestUnknownMarkWarning:
Unknown pytest.mark.large - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/mark.html
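
For reference, addinivalue_line is normally called from the pytest_configure hook in conftest.py; the surrounding hook is not visible in the excerpt above, so the placement below is an assumption:

# Sketch only: assumes the registration lives in pytest_configure in conftest.py.
def pytest_configure(config):
    # Registering each custom mark tells pytest the mark is intentional,
    # so PytestUnknownMarkWarning is no longer emitted for it.
    for marker in (
        "large",
        "requires_ssh",
        "lazy_import",
        "notrackingurimock",
        "allow_infer_pip_requirements_fallback",
    ):
        config.addinivalue_line("markers", marker)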

@harupy harupy (Member, Author) commented on Nov 1, 2021:

This warning is not harmful by itself, but it clutters the output and makes it difficult to spot other warnings.

@harupy harupy changed the title Fix pytest options to clarify which tests were skipped and how to run them Fix pytest options to show skipped tests and how to run them Oct 30, 2021
@harupy harupy changed the title Fix pytest options to show skipped tests and how to run them Fix pytest options to show skipped tests and how to unskip them Oct 30, 2021
@harupy harupy changed the title Fix pytest options to show skipped tests and how to unskip them Fix pytest options to show skipped tests and how to unskip them rather than deselecting them Nov 1, 2021
Comment on lines +58 to +72
def pytest_runtest_setup(item):
    markers = [mark.name for mark in item.iter_markers()]
    marked_as_large = "large" in markers
    large_option = item.config.getoption("--large")
    large_only_option = item.config.getoption("--large-only")
    if marked_as_large and not (large_option or large_only_option):
        pytest.skip("use `--large` or `--large-only` to run this test")
    if not marked_as_large and large_only_option:
        pytest.skip("remove `--large-only` to run this test")

    if "requires_ssh" in markers and not item.config.getoption("--requires-ssh"):
        pytest.skip("use `--requires-ssh` to run this test")

    if "lazy_import" in markers and not item.config.getoption("--lazy-import"):
        pytest.skip("use `--lazy-import` to run this test")
@harupy harupy (Member, Author) commented on Nov 1, 2021:

With this fix, the pytest output looks like this:

$ pytest tests/xgboost
...
tests/xgboost/test_xgboost_autolog.py::test_xgb_autolog_log_models_configuration[False] SKIPPED (use `--large` or `--large-only` to run this test)       [ 55%]
tests/xgboost/test_xgboost_autolog.py::test_xgb_autolog_does_not_break_dmatrix_instantiation_with_data_none PASSED                                       [ 57%]
tests/xgboost/test_xgboost_model_export.py::test_model_save_load SKIPPED (use `--large` or `--large-only` to run this test)                              [ 59%]
tests/xgboost/test_xgboost_model_export.py::test_signature_and_examples_are_saved_correctly SKIPPED (use `--large` or `--large-only` to run this test)   [ 61%]
tests/xgboost/test_xgboost_model_export.py::test_model_load_from_remote_uri_succeeds SKIPPED (use `--large` or `--large-only` to run this test)          [ 63%]
tests/xgboost/test_xgboost_model_export.py::test_model_log SKIPPED (use `--large` or `--large-only` to run this test)                                    [ 65%]

The output is a bit verbose, but it makes clear which tests were skipped and how to run them.
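
The `--large`, `--large-only`, `--requires-ssh`, and `--lazy-import` flags used above are registered through pytest_addoption; those definitions are not part of this excerpt, so the following is only a sketch of what they might look like:

# Hypothetical option registration; the real conftest.py may differ.
def pytest_addoption(parser):
    parser.addoption("--large", action="store_true", default=False,
                     help="Run tests marked as `large` in addition to small tests")
    parser.addoption("--large-only", action="store_true", default=False,
                     help="Run only tests marked as `large`")
    parser.addoption("--requires-ssh", action="store_true", default=False,
                     help="Run tests that require an SSH setup")
    parser.addoption("--lazy-import", action="store_true", default=False,
                     help="Run lazy-import tests")

With options like these, a contributor can unskip the tests listed above by rerunning, for example, `pytest tests/xgboost --large`.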

@dbczumar dbczumar (Collaborator) left a comment:

LGTM!

@harupy harupy merged commit add4ece into mlflow:master Nov 2, 2021
@harupy harupy deleted the fix-pytest-options branch November 2, 2021 00:22
Labels
rn/none (List under Small Changes in Changelogs)

2 participants