[CARBONDATA-2288] [Test] Exception is Masked Inside StandardPartitionTableQueryTestCase #2034
Conversation
Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/4082/
Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/2837/
SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/3781/
retest sdv please
SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/3843/
retest sdv please
SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/3878/
LGTM
…TableQueryTestCase This closes #2034
Problem: Inside this test case, the exception thrown differs between Spark versions, and `intercept[Exception]` masks that difference:

```scala
test("Creation of partition table should fail if the colname in table schema and partition column is same even if both are case sensitive") {
  intercept[Exception] {
    sql("CREATE TABLE uniqdata_char2(name char,id int) partitioned by (NAME char) stored by 'carbondata' ")
  }
}
```

For Spark 2.1 the exception message is:

> Operation not allowed: Partition columns should not be specified in the schema: name

but for Spark 2.2 it is:

> DataType char is not supported
Reason: the exception differs because Spark 2.1.0 allows creating a table with a `char` column declared without a length, while Spark 2.2.1 does not:

```
// Spark 2.1.0
scala> spark.sql("create table id(id char)");
18/02/23 14:12:23 WARN HiveMetaStore: Location: file:/home/anubhav/Documents/phatak/spark-2.1/bin/spark-warehouse/id specified for non-external table:id
res0: org.apache.spark.sql.DataFrame = []

// Spark 2.2.1
scala> spark.sql("create table id(id char)");
18/02/23 14:22:05 WARN ObjectStore: Failed to get database global_temp, returning NoSuchObjectException
org.apache.spark.sql.catalyst.parser.ParseException:
DataType char is not supported.(line 1, pos 19)
```
Solution: assert the version-specific exception message when the `char` data type is declared without a length.
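The solution above could be sketched as follows. This is a minimal, self-contained illustration of the version-dependent assertion pattern; the names `PartitionErrorCheck`, `expectedMessageFor`, and `matches` are hypothetical helpers, not CarbonData APIs, and the real test would obtain the version from Spark and use ScalaTest's `intercept` to capture the exception:

```scala
// Hypothetical sketch: pick the expected error fragment based on the Spark
// version, then assert on the intercepted exception's message instead of
// merely swallowing the exception.
object PartitionErrorCheck {
  // Expected error-message fragment per Spark version (hypothetical helper).
  def expectedMessageFor(sparkVersion: String): String =
    if (sparkVersion.startsWith("2.1")) {
      "Partition columns should not be specified in the schema"
    } else {
      "DataType char is not supported"
    }

  // True when the caught message matches what that Spark version should throw.
  def matches(sparkVersion: String, caughtMessage: String): Boolean =
    caughtMessage.contains(expectedMessageFor(sparkVersion))
}
```

In the test itself, the result of `intercept[Exception] { ... }` would be captured and its `getMessage` checked with a helper like `matches`, so a wrong exception type or message fails the test rather than being masked.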
Be sure to complete the following checklist to help us incorporate
your contribution quickly and easily:
Any interfaces changed?
Any backward compatibility impacted?
Document update required?
Testing done
Please provide details on
- Whether new unit test cases have been added or why no new tests are required?
- How it is tested? Please attach test report.
- Is it a performance related change? Please attach the performance test report.
- Any additional information to help reviewers in testing this change.
For large changes, please consider breaking it into sub-tasks under an umbrella JIRA.