[CARBONDATA-1867] Add support for task/segment level pruning #1624
Conversation
Build Success with Spark 2.2.0, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/499/
Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/1762/
 * @return
 * @throws IOException
 */
public boolean isScanRequired(String segmentId, FilterResolverIntf filterExp) throws IOException {
It should not be isScanRequired; it should be pruneSegments, which takes the segment list and the filter expression and returns the pruned list of segments.
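A minimal sketch of the shape the reviewer is suggesting: rather than a per-segment boolean check, a single method that filters the whole segment list. All names here (SegmentPruner, Filter, mayMatch) are illustrative stand-ins, not CarbonData's actual API; the real filter type would be FilterResolverIntf.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of the suggested pruneSegments signature:
// take the full segment list plus a filter, return only the segments
// whose task-level min/max could satisfy the filter.
public class SegmentPruner {

  // Stand-in for FilterResolverIntf; the real interface lives in CarbonData.
  interface Filter {
    boolean mayMatch(String segmentId);
  }

  public static List<String> pruneSegments(List<String> segments, Filter filterExp) {
    List<String> pruned = new ArrayList<>();
    for (String segmentId : segments) {
      // Keep only segments the filter cannot rule out.
      if (filterExp.mayMatch(segmentId)) {
        pruned.add(segmentId);
      }
    }
    return pruned;
  }

  public static void main(String[] args) {
    List<String> segments = List.of("0", "1", "2");
    // Toy filter: pretend only even-numbered segments can match.
    List<String> kept = pruneSegments(segments, id -> Integer.parseInt(id) % 2 == 0);
    System.out.println(kept); // [0, 2]
  }
}
```

Returning a pruned list keeps the decision in one place and lets the caller drive the scan from the result directly, instead of looping and calling a boolean check per segment.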
 * @param taskMinMaxRow
 * @return
 */
public void addTaskMinMaxRowToUnsafe(DataMapRow taskMinMaxRow, List<Integer> indexesToAccess)
Why is a new method required? Why not use addToUnsafe?
} else {
  byte[][] existingMinMaxValues = getMinMaxValue(taskMinMaxRow, ordinal);
  // Compare and update min max values
  SerializableComparator comparator =
I think you can use UnsafeComparer.compare
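A hedged sketch of what the min/max merge being discussed does: fold each block's min/max into the running task-level min/max using an unsigned lexicographic byte[] comparison, which is the kind of comparison the suggested UnsafeComparer.compare performs. The class and method names here are illustrative, not CarbonData's actual code.

```java
// Illustrative sketch of merging block-level min/max into task-level min/max.
// compareBytes mimics an unsigned lexicographic byte[] compare (the spirit of
// the suggested UnsafeComparer.compare); names are hypothetical.
public class MinMaxMerge {

  // Unsigned lexicographic comparison of two byte arrays.
  static int compareBytes(byte[] a, byte[] b) {
    int len = Math.min(a.length, b.length);
    for (int i = 0; i < len; i++) {
      int diff = (a[i] & 0xFF) - (b[i] & 0xFF);
      if (diff != 0) {
        return diff;
      }
    }
    return a.length - b.length;
  }

  // Update taskMinMax[0] (min) and taskMinMax[1] (max) in place
  // with a new block's min and max values.
  static void merge(byte[][] taskMinMax, byte[] blockMin, byte[] blockMax) {
    if (compareBytes(blockMin, taskMinMax[0]) < 0) {
      taskMinMax[0] = blockMin;
    }
    if (compareBytes(blockMax, taskMinMax[1]) > 0) {
      taskMinMax[1] = blockMax;
    }
  }

  public static void main(String[] args) {
    byte[][] taskMinMax = { {5}, {10} };
    merge(taskMinMax, new byte[]{3}, new byte[]{8});   // lowers the min
    merge(taskMinMax, new byte[]{6}, new byte[]{12});  // raises the max
    System.out.println(taskMinMax[0][0] + " " + taskMinMax[1][0]); // 3 12
  }
}
```

Using a plain unsigned byte comparison here avoids instantiating a per-datatype SerializableComparator when the stored min/max are already in sortable byte form, which appears to be the reviewer's point.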
  }
}

private void addRowToUnsafeMemoryStore(DataMapRow row) {
I think it is better to set some junk or default values, except for the min/max, in the case of the last row, so that it will be easy to read and write.
SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/2145/
…ask level min/max which can be helpful for task/segment level pruning
Force-pushed from 5ada233 to aad3759
Build Success with Spark 2.2.0, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/523/
Force-pushed from aad3759 to c0c32b9
@ravipesala, handled review comments. Kindly review and merge.
Build Success with Spark 2.2.0, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/524/
Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/1780/
SDV Build Fail , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/2155/
SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/2156/
Force-pushed from c0c32b9 to 08d2dcd
Build Success with Spark 2.2.0, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/549/
Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/1807/
SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/2179/
LGTM
Added support for task/segment level pruning. Added code to compute task level min/max which can be helpful for task/segment level pruning This closes apache#1624
Added support for task/segment level pruning. Added code to compute task level min/max which can be helpful for task/segment level pruning
Any interfaces changed?
Any backward compatibility impacted?
Document update required?
Testing done
For large changes, please consider breaking it into sub-tasks under an umbrella JIRA.