[CARBONDATA-1596] Fixed IntermediateFileMerger for decimal types #1420
Conversation
Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/483/
Build Success with Spark 1.6, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/358/
Can you add a testcase to verify?
SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/1111/
Force-pushed from b0ef7ef to 39e8e6a
@jackylk Test case added. Please review.
sortParameters.setPrefetch(false);
sortParameters.setSortFileCompressionEnabled(false);
sortParameters.setFileWriteBufferSize(2);
sortParameters.setMeasureDataType(new DataType[] { DataTypes.DECIMAL });
I meant to add a testcase that reproduces the error.
Is it an error that occurs when compacting a table that has a decimal type?
Yes. While merging the sort temp files, the following code was throwing the error:
(byte[]) NonDictionaryUtil.getMeasure(fieldIndex, row);
The fix is to use DataTypeUtil.bigDecimalToByte to convert the BigDecimal value to byte[].
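The direct cast fails because the measure object is a BigDecimal, not a byte array, so an explicit serialization step is needed. Below is a minimal, hypothetical sketch of that kind of conversion; the class name, the 4-byte scale prefix, and the helper signatures are illustrative assumptions, not CarbonData's actual DataTypeUtil.bigDecimalToByte implementation:

```java
import java.math.BigDecimal;
import java.math.BigInteger;
import java.nio.ByteBuffer;

public class DecimalBytesSketch {

    // Hypothetical stand-in for DataTypeUtil.bigDecimalToByte: serialize the
    // scale (4 bytes) followed by the unscaled value's two's-complement bytes.
    // CarbonData's real encoding may differ; this only illustrates why a
    // proper conversion is needed instead of a (byte[]) cast.
    static byte[] bigDecimalToByte(BigDecimal value) {
        byte[] unscaled = value.unscaledValue().toByteArray();
        ByteBuffer buffer = ByteBuffer.allocate(4 + unscaled.length);
        buffer.putInt(value.scale());
        buffer.put(unscaled);
        return buffer.array();
    }

    // Inverse of the sketch above, to show the encoding round-trips.
    static BigDecimal byteToBigDecimal(byte[] raw) {
        ByteBuffer buffer = ByteBuffer.wrap(raw);
        int scale = buffer.getInt();
        byte[] unscaled = new byte[raw.length - 4];
        buffer.get(unscaled);
        return new BigDecimal(new BigInteger(unscaled), scale);
    }

    public static void main(String[] args) {
        BigDecimal measure = new BigDecimal("12345.6789");
        // Casting measure with (byte[]) would throw ClassCastException;
        // an explicit conversion is required instead.
        byte[] bytes = bigDecimalToByte(measure);
        System.out.println(byteToBigDecimal(bytes)); // prints 12345.6789
    }
}
```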
So can you add a testcase that performs compaction on a table having a decimal type, instead of using mocks?
Build Failed with Spark 1.6, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/362/
Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/487/
SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/1115/
Force-pushed from 39e8e6a to baad743
@jackylk Added an integration test instead of a unit test.
Build Failed with Spark 1.6, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/367/
Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/492/
SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/1119/
retest this please
Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/509/
SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/1137/
retest this please
Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/523/
SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/1153/
It seems some SDV tests have failed.
retest this please
SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/1157/
Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/527/
SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/1172/
Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/541/
retest this please
Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/740/
SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/1377/
@kunal642 Please check the test failure.
@ravipesala This test is passing on my local machine.
Force-pushed from baad743 to 8c99189
Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/794/
SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/1427/
@ravipesala @jackylk Please review. The build is failing randomly.
LGTM
Analysis: casting BigDecimal to byte[] was throwing a ClassCastException in IntermediateFileMerger.
Solution: Use DataTypeUtil.bigDecimalToByte to convert the BigDecimal to byte[]. This closes apache#1420.