[CARBONDATA-3136] Fix JVM crash with preaggregate datamap when average of decimal column is taken with orderby. #2958
problem: JVM crash with the preaggregate datamap when the average of a decimal column is taken together with an order by.
cause: While preparing the plan with the preaggregate datamap, the decimal column was cast to double in the average expression. This led to a JVM crash in Spark because values were being filled with the wrong precision (call stack in the JIRA).
solution: For decimal data types, the division result of the average should be cast to decimal instead of double.
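To illustrate why the double cast is unsafe (this is a standalone sketch with hypothetical values, not the actual CarbonData plan rewrite): a decimal wider than double's ~16 significant digits silently loses precision when the average's division is performed in double, whereas performing the division in decimal preserves the declared scale.

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class DecimalAvg {
    // Averaging via a double cast: values wider than double's
    // ~16 significant digits are rounded before the division.
    static double avgViaDouble(BigDecimal[] vs) {
        double sum = 0;
        for (BigDecimal v : vs) sum += v.doubleValue();
        return sum / vs.length;
    }

    // Averaging in decimal: sum and divide as BigDecimal,
    // keeping the column's scale (6 here, an assumed example scale).
    static BigDecimal avgViaDecimal(BigDecimal[] vs) {
        BigDecimal sum = BigDecimal.ZERO;
        for (BigDecimal v : vs) sum = sum.add(v);
        return sum.divide(BigDecimal.valueOf(vs.length), 6, RoundingMode.HALF_UP);
    }

    public static void main(String[] args) {
        BigDecimal[] vs = {
            new BigDecimal("12345678901234567890.123456"),
            new BigDecimal("12345678901234567890.123458")
        };
        System.out.println(avgViaDouble(vs));  // fractional digits lost
        System.out.println(avgViaDecimal(vs)); // exact: ...7890.123457
    }
}
```

The fix follows the decimal path: the `Divide` in the rewritten average expression is cast to the decimal type rather than double, so the result carries the precision the consumer expects.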
Be sure to do all of the following checklist to help us incorporate
your contribution quickly and easily:
Any interfaces changed? NA
Any backward compatibility impacted? NA
Document update required? NA
Testing done? Yes, added a UT.
For large changes, please consider breaking it into sub-tasks under an umbrella JIRA. NA