[MAINTENANCE] Update NotImported mechanism to use scoped compatibility modules #7635
Conversation
…ob Storage, and Spark import situations.
Thank you for making all these changes! I have a few small requests and a question, but I'll approve and leave it to your judgement. The question is: why are there type-ignore statements in some compatibility/* files but not others? E.g. they are in great_expectations/compatibility/google.py but not in sqlalchemy.py. Can we remove more of them?
```diff
@@ -460,15 +456,19 @@ def prepare_dump(self, data, **kwargs):
         This method calls the schema's jsonValue() method, which translates the object into a json
         """
         # check whether spark exists
-        if StructType is None:
+        if (not pyspark.types) or (pyspark.types.StructType is None):
```
StructType won't be None, so I think you can remove the right side of the or.
```diff
@@ -933,15 +933,19 @@ def prepare_dump(self, data, **kwargs):
         This method calls the schema's jsonValue() method, which translates the object into a json
         """
         # check whether spark exists
-        if StructType is None:
+        if (not pyspark.types) or (pyspark.types.StructType is None):
```
Same comment: we have to be careful comparing to None, since we are now using the NotImported type.
@anthonyburdi I see what you mean. I fixed it to utilize the boolean of NotImported and guard against AttributeError -- thanks!
Force-pushed the …ink/update_not_imported_mechanism_to_use_scoped_compatibility_modules_instead-2023_04_14-2 branch from 94694f3 to 5354641.
@anthonyburdi It is not entirely clear why this inconsistency happens. (It was observed even before the large refactoring work aimed at utilizing the scoped compatibility modules.)
Scope: sqlalchemy, pyspark, azure, google, etc.

Next To Be Done: Update imports to use the new NotImported style for the packages AWS S3, Trino, AWS Athena, Google BigQuery, Snowflake, and AWS Redshift (and any others as needed).

Please annotate your PR title to describe what the PR does, then give a brief bulleted description of your PR below. PR titles should begin with [BUGFIX], [FEATURE], [DOCS], [MAINTENANCE], or [CONTRIB]. If a new feature introduces breaking changes for the Great Expectations API or configuration files, please also add [BREAKING]. You can read about the tags in our contributor checklist.
Changes proposed in this pull request:
After submitting your PR, CI checks will run and @cla-bot will check for your CLA signature.
For a PR with nontrivial changes, we review with both design-centric and code-centric lenses.
In a design review, we aim to ensure that the PR is consistent with our relationship to the open source community, with our software architecture and abstractions, and with our users' needs and expectations. That review often starts well before a PR, for example in GitHub issues or Slack, so please link to relevant conversations in notes below to help reviewers understand and approve your PR more quickly (e.g. closes #123).

Previous Design Review notes:
Definition of Done
Please delete options that are not relevant.
Thank you for submitting!