[SPARK-18482][SQL] make sure Spark can access the table metadata created by older version of spark #16003
Conversation
@@ -1370,47 +1370,4 @@ class MetastoreDataSourcesSuite extends QueryTest with SQLTestUtils with TestHiv
      sparkSession.sparkContext.conf.set(DEBUG_MODE, previousValue)
    }
  }
test("SPARK-17470: support old table that stores table location in storage properties") {
  }
}
test("SPARK-18464: support old table which doesn't store schema in table properties") {
it's covered by the new test suite
Test build #69134 has finished for PR 16003 at commit
import org.apache.spark.util.Utils

class HiveExternalCatalogCompatibilitySuite extends QueryTest with TestHiveSingleton {
To make the name super long...
HiveExternalCatalogBackwardCompatibilitySuite
Test build #69145 has finished for PR 16003 at commit
The main thing I'd add is a comment explaining which version of Spark would generate those table props.
How about Spark 2.1 altering the table metadata created by Spark 2.0?
Test build #69200 has started for PR 16003 at commit
Test build #3440 has finished for PR 16003 at commit
// Raw table metadata that was dumped from tables created by Spark 2.0. Note that all Spark
// versions prior to 2.1 would generate the same raw table metadata for a given table.
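For illustration, a hard-coded dump in such a suite might look like the following Scala sketch. This is an assumption about the shape of the fixtures, not the PR's actual code: the table name, path, and property values here are hypothetical stand-ins.

```scala
// Hypothetical sketch of a raw table dump, as Spark 2.0 would have written it
// to the Hive metastore. Names, paths, and properties are illustrative only.
import org.apache.spark.sql.catalyst.TableIdentifier
import org.apache.spark.sql.catalyst.catalog.{CatalogStorageFormat, CatalogTable, CatalogTableType}
import org.apache.spark.sql.types.StructType

lazy val rawTable = CatalogTable(
  identifier = TableIdentifier("tbl1", Some("default")),
  tableType = CatalogTableType.MANAGED,
  storage = CatalogStorageFormat.empty.copy(
    // Pre-2.1 Spark kept the data-source table location in storage properties
    // rather than in locationUri (cf. SPARK-17470).
    properties = Map("path" -> "/tmp/warehouse/tbl1")),
  // Pre-2.1 Spark could leave the schema out of table properties (cf. SPARK-18464),
  // so the raw metadata carries an empty schema.
  schema = new StructType(),
  properties = Map("spark.sql.sources.provider" -> "json"))
```

The point of hard-coding the metadata literally, rather than creating the table through the current code path, is that the fixture stays frozen in the old on-disk format even as `HiveExternalCatalog` evolves.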
I briefly checked 1.6. Most of them are the same, but some changes are only available in 2.0. For example, `locationUri = Some(defaultTablePath("tbl7") + "-__PLACEHOLDER__"),` was added in #13270.
LGTM except a minor comment. We can address it in a separate PR for checking 1.6.
Test build #69217 has finished for PR 16003 at commit
Merging in master/branch-2.1.
…ted by older version of spark

## What changes were proposed in this pull request?

In Spark 2.1, we did a lot of refactoring for `HiveExternalCatalog` and related code paths. These refactors may introduce external behavior changes and break backward compatibility, e.g. http://issues.apache.org/jira/browse/SPARK-18464

To avoid future compatibility problems with `HiveExternalCatalog`, this PR dumps some typical table metadata from tables created by Spark 2.0 and tests whether it can be recognized by the current version of Spark.

## How was this patch tested?

Test-only change.

Author: Wenchen Fan <wenchen@databricks.com>

Closes #16003 from cloud-fan/test.

(cherry picked from commit fc2c13b)
Signed-off-by: Reynold Xin <rxin@databricks.com>
What changes were proposed in this pull request?

In Spark 2.1, we did a lot of refactoring for `HiveExternalCatalog` and related code paths. These refactors may introduce external behavior changes and break backward compatibility, e.g. http://issues.apache.org/jira/browse/SPARK-18464

To avoid future compatibility problems with `HiveExternalCatalog`, this PR dumps some typical table metadata from tables created by Spark 2.0 and tests whether it can be recognized by the current version of Spark.

How was this patch tested?

Test-only change.
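The test pattern the description outlines can be sketched roughly as follows. This is a hedged reconstruction under assumptions: `rawTablesAndExpectations`, the `hiveClient` handle, and the exact assertions are illustrative names, not the suite's real members.

```scala
// Hypothetical sketch of the compatibility-test loop. For each dumped
// Spark 2.0 CatalogTable, write it into the metastore via the low-level
// Hive client (bypassing Spark's CREATE TABLE path, so the metadata keeps
// its old format), then verify that current Spark can still read it.
rawTablesAndExpectations.foreach { case (rawTable, expectedSchema) =>
  // Plant the old-format metadata directly in the Hive metastore.
  hiveClient.createTable(rawTable, ignoreIfExists = false)

  // Current Spark must resolve the table and restore/infer its schema,
  // even when the schema was never stored in table properties.
  val readBack = spark.table(rawTable.identifier.table)
  assert(readBack.schema == expectedSchema)
}
```

Going through the raw Hive client is the design point here: if the suite created the tables with current Spark, the fixtures would silently migrate to the new metadata format and the backward-compatibility check would test nothing.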