chore(ingest/s3) Bump Deequ and Pyspark version #8638

Merged
11 commits merged into datahub-project:master on Aug 29, 2023

Conversation

@treff7es (Contributor) commented Aug 15, 2023

Closes #6852

Checklist

  • The PR conforms to DataHub's Contributing Guideline (particularly Commit Message Format)
  • Links to related issues (if applicable)
  • Tests for the changes have been added/updated (if applicable)
  • Docs related to the changes have been added/updated (if applicable). If a new feature has been added, a Usage Guide has been added for it.
  • For any breaking change/potential downtime/deprecation/big changes an entry has been made in Updating DataHub

@github-actions bot added the ingestion label (PR or Issue related to the ingestion of metadata) on Aug 15, 2023
@@ -259,13 +259,14 @@ def init_spark(self):
import pydeequ

conf = SparkConf()

spark_version = os.getenv("SPARK_VERSION", "3.0")
Collaborator

3.0? Don’t we want 3.3?
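(For context on why this default matters: pydeequ selects the Deequ Maven coordinate to load onto the Spark classpath from the SPARK_VERSION environment variable, so the fallback should track the pinned pyspark version. A minimal sketch of that setup pattern, assuming the rest of init_spark follows pydeequ's documented usage; deequ_maven_coord and f2j_maven_coord are pydeequ attributes and are not part of this diff:)

```python
import os

# pydeequ picks the Deequ jar that matches this Spark version; set it before
# importing pydeequ and keep it consistent with the pinned pyspark (3.3.x).
os.environ.setdefault("SPARK_VERSION", "3.3")

import pydeequ  # noqa: E402
from pyspark import SparkConf
from pyspark.sql import SparkSession

conf = SparkConf()
# Pull the matching Deequ artifact and exclude the conflicting f2j binding.
conf.set("spark.jars.packages", pydeequ.deequ_maven_coord)
conf.set("spark.jars.excludes", pydeequ.f2j_maven_coord)

spark = SparkSession.builder.config(conf=conf).getOrCreate()
```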

"pydeequ>=1.0.1, <1.1",
"pyspark==3.0.3",
"pydeequ>=1.0.1",
"pyspark==3.3.2",
Collaborator

It might still be good to keep this strictly pinned instead of >=


Spark 3.3.3 has already been released. Would it be better to pin to the 3.3.x branch, i.e. a compatible release ("pyspark~=3.3.0"), so Spark can be upgraded with any future micro release in 3.3?

Contributor Author

Makes sense, thanks.

@hsheth2 (Collaborator) commented Aug 15, 2023

@treff7es looks like CI is failing on this one

@treff7es merged commit d86b336 into datahub-project:master on Aug 29, 2023
52 checks passed
Labels
ingestion PR or Issue related to the ingestion of metadata
Development

Successfully merging this pull request may close these issues.

Bump pyspark dependency to >=3.1.3
4 participants