
Add support for minor version incremental increase for pyspark #5366

Merged: 2 commits merged into master on Feb 11, 2022

Conversation

@BenWilson2 (Member) commented Feb 10, 2022

Signed-off-by: Ben Wilson <benjamin.wilson@databricks.com>

What changes are proposed in this pull request?

Modify the version validation logic for pyspark when running on Databricks so that minor (or micro) version updates remain compatible with the version validation checks.
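
To illustrate the intent (a rough sketch, not the exact code in this PR): the installed pyspark version can be truncated to its major.minor components before comparing against the supported range, so a Databricks runtime that ships, for example, pyspark 3.2.1 still satisfies a declared maximum of 3.2.0. The helper name below is hypothetical:

from packaging.version import Version

def _is_compatible_on_databricks(installed, min_ver, max_ver):
    # Hypothetical helper for illustration: compare only major.minor so a
    # micro bump (e.g. 3.2.0 -> 3.2.1) does not fail the range check.
    parsed = Version(installed)
    truncated = Version(f"{parsed.major}.{parsed.minor}")
    return Version(min_ver) <= truncated <= Version(max_ver)

print(_is_compatible_on_databricks("3.2.1", "3.0.0", "3.2.0"))  # True
print(_is_compatible_on_databricks("3.3.0", "3.0.0", "3.2.0"))  # False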

How is this patch tested?

unit tests

Does this PR change the documentation?

  • No. You can skip the rest of this section.
  • Yes. Make sure the changed pages / sections render correctly by following the steps below.
  1. Check the status of the ci/circleci: build_doc check. If it's successful, proceed to the next step, otherwise fix it.
  2. Click Details on the right to open the job page of CircleCI.
  3. Click the Artifacts tab.
  4. Click docs/build/html/index.html.
  5. Find the changed pages / sections and make sure they render correctly.

Release Notes

Is this a user-facing change?

  • No. You can skip the rest of this section.
  • Yes. Give a description of this change to be included in the release notes for MLflow users.

(Details in 1-2 sentences. You can just refer to another PR with a description if this PR is part of a larger change.)

What component(s), interfaces, languages, and integrations does this PR affect?

Components

  • area/artifacts: Artifact stores and artifact logging
  • area/build: Build and test infrastructure for MLflow
  • area/docs: MLflow documentation pages
  • area/examples: Example code
  • area/model-registry: Model Registry service, APIs, and the fluent client calls for Model Registry
  • area/models: MLmodel format, model serialization/deserialization, flavors
  • area/projects: MLproject format, project running backends
  • area/scoring: MLflow Model server, model deployment tools, Spark UDFs
  • area/server-infra: MLflow Tracking server backend
  • area/tracking: Tracking Service, tracking client APIs, autologging

Interface

  • area/uiux: Front-end, user experience, plotting, JavaScript, JavaScript dev server
  • area/docker: Docker use across MLflow's components, such as MLflow Projects and MLflow Models
  • area/sqlalchemy: Use of SQLAlchemy in the Tracking Service or Model Registry
  • area/windows: Windows support

Language

  • language/r: R APIs and clients
  • language/java: Java APIs and clients
  • language/new: Proposals for new client languages

Integrations

  • integrations/azure: Azure and Azure ML integrations
  • integrations/sagemaker: SageMaker integrations
  • integrations/databricks: Databricks integrations

How should the PR be classified in the release notes? Choose one:

  • rn/breaking-change - The PR will be mentioned in the "Breaking Changes" section
  • rn/none - No description will be included. The PR will be mentioned only by the PR number in the "Small Bugfixes and Documentation Updates" section
  • rn/feature - A new user-facing feature worth mentioning in the release notes
  • rn/bug-fix - A user-facing bug fix worth mentioning in the release notes
  • rn/documentation - A user-facing documentation change worth mentioning in the release notes

…abricks

Signed-off-by: Ben Wilson <benjamin.wilson@databricks.com>
The github-actions bot added the integrations/databricks (Databricks integrations) and rn/bug-fix (Mention under Bug Fixes in Changelogs) labels on Feb 10, 2022.
Comment on lines 32 to 33
if Version(ver) > Version(min_ver):
    ver = _reset_minor_version(ver)
@dbczumar (Collaborator) commented Feb 10, 2022

The following might be less brittle (it lets Version() do the work of identifying major and minor) and is contained within this single function, since the logic isn't needed elsewhere. For version "3", minor is interpreted as 0 (Version("3").minor == 0).

Suggested change

-if Version(ver) > Version(min_ver):
-    ver = _reset_minor_version(ver)
+parsed_ver = Version(ver)
+if parsed_ver > Version(min_ver):
+    ver = f"{parsed_ver.major}.{parsed_ver.minor}"

@BenWilson2 (Member, Author)

excellent idea. Thanks!

@@ -28,6 +28,12 @@ def _check_version_in_range(ver, min_ver, max_ver):
    return Version(min_ver) <= Version(ver) <= Version(max_ver)


def _check_spark_version_in_range(ver, min_ver, max_ver):
Collaborator:

Can we add a short docstring to this function explaining what it does and why it's necessary?

@BenWilson2 (Member, Author)

added along with an example to explain why we're doing this.
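
A plausible sketch of the documented helper (not necessarily the exact docstring or code that landed), combining the existing range check with the major.minor truncation suggested above; it relies on the Version import and the _check_version_in_range helper visible in the diff hunk:

def _check_spark_version_in_range(ver, min_ver, max_ver):
    """
    Check whether a pyspark version is within the supported range, treating
    newer micro releases of the maximum supported major.minor line as
    compatible. For example, with a declared maximum of 3.2.0, a Databricks
    runtime shipping pyspark 3.2.1 is reduced to 3.2 before the comparison
    and therefore still passes.
    """
    parsed_ver = Version(ver)
    if parsed_ver > Version(min_ver):
        ver = f"{parsed_ver.major}.{parsed_ver.minor}"
    return _check_version_in_range(ver, min_ver, max_ver)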

@dbczumar (Collaborator) left a comment

LGTM with a couple comments! Thanks Ben!

Signed-off-by: Ben Wilson <benjamin.wilson@databricks.com>
@WeichenXu123 (Collaborator) left a comment

In the future, we should improve this by supporting a maximum-version format like "3.2.*" in "ml-package-versions.yml".
CC @harupy
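
For reference, a wildcard maximum such as "3.2.*" can be matched with packaging's SpecifierSet; a minimal sketch of how such a format could be interpreted (an illustration, not current MLflow behavior):

from packaging.specifiers import SpecifierSet
from packaging.version import Version

max_spec = SpecifierSet("==3.2.*")   # wildcard maximum from ml-package-versions.yml
print(Version("3.2.1") in max_spec)  # True
print(Version("3.3.0") in max_spec)  # False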

@WeichenXu123 (Collaborator)

Also upgrade the pyspark maximum version to 3.2.1 in ml-package-versions.yml.

@BenWilson2 BenWilson2 merged commit a9897ac into master Feb 11, 2022