[CARBONDATA-530] Modified optimizer to place decoder on top of limit in case of sort and limit. #429
Conversation
LGTM, but please add the same code in
Build Success with Spark 1.5.2, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/155/
Build Success with Spark 1.5.2, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/165/
Seems CI is failing for -Pspark-2.0
Build Success with Spark 1.5.2, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/253/
@jackylk is there any way to get the build URL for spark-2.0 as we have for spark-1.5?
@ashokblend, please check
Build Success with Spark 1.5.2, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/335/
Build Failed with Spark 1.6.2, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/1079/
Please change the title as per the format: [CARBONDATA-<issue number>] Title of the pull request (need to add a blank)
retest this please
Build Success with Spark 1.6.2, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/1173/
@ashokblend Spark 2.1 test cases are failing, can you fix it?
Refer to this link for build results (access rights to CI server needed):
Can one of the admins verify this patch? |
SDV Build Failed with Spark 2.1, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/65/
Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/636/
SDV Build Failed, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/1267/
Closing it.
Problem
While optimizing the plan of a query that has ORDER BY and LIMIT, we place the outer decoder below the limit. Because of this, Spark is not able to optimize the plan (e.g. it cannot collapse the sort and limit into a single TakeOrderedAndProject).
Solution
We need to optimize the plan so that the outer decoder is placed on top of the limit in the above case, as shown below:
== Physical Plan ==
CarbonDictionaryDecoder [CarbonDecoderRelation(Map(name#3 -> name#3),CarbonDatasourceRelation(default.dict,None))], ExcludeProfile(ArrayBuffer(name#3)), CarbonAliasDecoderRelation()
 TakeOrderedAndProject(limit=2, orderBy=[name#3 ASC], output=[name#3])
  ConvertToSafe
   CarbonDictionaryDecoder [CarbonDecoderRelation(Map(name#3 -> name#3),CarbonDatasourceRelation(default.dict,None))], IncludeProfile(ArrayBuffer(name#3)), CarbonAliasDecoderRelation()
    CarbonScan [name#3], (CarbonRelation default, dict, CarbonMetaData(ArrayBuffer(name),ArrayBuffer(default_dummy_measure),org.apache.carbondata.core.carbon.metadata.schema.table.CarbonTable@62b3dcd6,DictionaryMap(Map(name -> true))), org.apache.carbondata.spark.merger.TableMeta@e67983a, None), [(name#3 = hello)], false

Code Generation: true
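The rewrite described above can be sketched with a tiny, self-contained Scala model of a plan tree. This is not the actual CarbonData optimizer code or Spark's Catalyst API; `Plan`, `Scan`, `Decoder`, `Limit`, and `pullDecoderAboveLimit` are all hypothetical names used only to illustrate the transformation: a decoder sitting directly below a limit is pulled above it, so the sort-plus-limit subtree stays intact for Spark to optimize.

```scala
// Hypothetical, simplified model of the plan rewrite (not real Carbon/Spark classes).
sealed trait Plan
case class Scan(table: String) extends Plan            // leaf: table scan
case class Decoder(child: Plan) extends Plan           // dictionary decoder
case class Limit(n: Int, child: Plan) extends Plan     // limit (with ordering)

// Rule: Limit(n, Decoder(child)) => Decoder(Limit(n, child)),
// applied recursively over the whole tree.
def pullDecoderAboveLimit(plan: Plan): Plan = plan match {
  case Limit(n, Decoder(child)) => Decoder(Limit(n, pullDecoderAboveLimit(child)))
  case Limit(n, child)          => Limit(n, pullDecoderAboveLimit(child))
  case Decoder(child)           => Decoder(pullDecoderAboveLimit(child))
  case leaf                     => leaf
}
```

For example, `pullDecoderAboveLimit(Limit(2, Decoder(Scan("default.dict"))))` yields `Decoder(Limit(2, Scan("default.dict")))`, mirroring the fixed plan where CarbonDictionaryDecoder appears above TakeOrderedAndProject.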