
[CARBONDATA-530]Modified optimizer to place decoder on top of limit in case of sort and limt. #429

Closed

Conversation

@ashokblend (Contributor) commented Dec 13, 2016

Problem
While optimizing the plan of a query that has both an order by and a limit, we place the outer decoder below the limit. Because of this, Spark is not able to optimize the plan (it cannot collapse the sort and limit into a single TakeOrderedAndProject).

Solution
We need to optimize the plan so that the outer decoder is placed on top of the limit in this case, as shown below:

|== Physical Plan == |
|CarbonDictionaryDecoder [CarbonDecoderRelation(Map(name#3 -> name#3),CarbonDatasourceRelation(default.dict,None))], ExcludeProfile(ArrayBuffer(name#3)), CarbonAliasDecoderRelation() |
| TakeOrderedAndProject(limit=2, orderBy=[name#3 ASC], output=[name#3]) |
| ConvertToSafe |
| CarbonDictionaryDecoder [CarbonDecoderRelation(Map(name#3 -> name#3),CarbonDatasourceRelation(default.dict,None))], IncludeProfile(ArrayBuffer(name#3)), CarbonAliasDecoderRelation() |
| CarbonScan [name#3], (CarbonRelation default, dict, CarbonMetaData(ArrayBuffer(name),ArrayBuffer(default_dummy_measure),org.apache.carbondata.core.carbon.metadata.schema.table.CarbonTable@62b3dcd6,DictionaryMap(Map(name -> true))), org.apache.carbondata.spark.merger.TableMeta@e67983a, None), [(name#3 = hello)], false|
| |
|Code Generation: true |
+----------------------
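
For illustration, the shape of this rewrite can be sketched as a small Catalyst rule against the Spark 1.x logical plan API. The DictDecoder node below is a hypothetical stand-in for the logical node the Carbon optimizer inserts for CarbonDictionaryDecoder (the real code also carries the decoder relations and include/exclude profiles shown in the plan above):

```scala
import org.apache.spark.sql.catalyst.expressions.Attribute
import org.apache.spark.sql.catalyst.plans.logical.{Limit, LogicalPlan, Sort, UnaryNode}
import org.apache.spark.sql.catalyst.rules.Rule

// Hypothetical stand-in for the logical node wrapping CarbonDictionaryDecoder.
case class DictDecoder(child: LogicalPlan) extends UnaryNode {
  override def output: Seq[Attribute] = child.output
}

// Sketch of the idea: rewrite Limit(Sort(DictDecoder(child))) into
// DictDecoder(Limit(Sort(child))) so Spark still sees the plain
// limit-over-sort pattern (and can plan TakeOrderedAndProject), while the
// outer decode runs only on the rows that survive the limit.
object PullDecoderAboveLimit extends Rule[LogicalPlan] {
  def apply(plan: LogicalPlan): LogicalPlan = plan transform {
    case Limit(limitExpr, Sort(order, global, DictDecoder(child))) =>
      DictDecoder(Limit(limitExpr, Sort(order, global, child)))
  }
}
```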

@ravipesala (Contributor) commented Dec 13, 2016

LGTM, but please add the same code in CarbonLateDecodeRule as well.

@CarbonDataQA

Build Success with Spark 1.5.2, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/155/

@CarbonDataQA

Build Success with Spark 1.5.2, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/165/

@jackylk (Contributor) commented Dec 14, 2016

It seems CI is failing for -Pspark-2.0:

[ERROR] /home/travis/build/apache/incubator-carbondata/integration/spark2/src/main/scala/org/apache/spark/sql/optimizer/CarbonLateDecodeRule.scala:142: error: not found: type Limit
[INFO]         case limit: Limit =>
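
In Spark 2.x the old logical Limit node is split into GlobalLimit and LocalLimit (Limit survives only as a helper object, not a type), which is why the type pattern `case limit: Limit =>` fails to compile against the 2.x catalyst API. As a minimal, illustrative sketch (not the actual CarbonLateDecodeRule change), a Spark 2.x rule could match the limit like this:

```scala
import org.apache.spark.sql.catalyst.plans.logical.{GlobalLimit, LocalLimit, LogicalPlan}
import org.apache.spark.sql.catalyst.rules.Rule

// Illustrative only: Spark 2.x plans a limit as
// GlobalLimit(expr, LocalLimit(expr, child)), so the rule matches those node
// types instead of the removed Limit type.
object MatchLimitOnSpark2 extends Rule[LogicalPlan] {
  def apply(plan: LogicalPlan): LogicalPlan = plan transform {
    case g @ GlobalLimit(_, LocalLimit(_, _)) =>
      g // the decoder would be placed above the limit here, as in the Spark 1.x path
  }
}
```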

@CarbonDataQA

Build Success with Spark 1.5.2, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/253/

@ashokblend ashokblend changed the title [CARBONDATA-530]Modified optimizer to place decoder on top of limit. [CARBONDATA-530]Modified optimizer to place decoder on top of limit in case of sort and limt. Dec 21, 2016
@ashokblend (Contributor, Author)

@jackylk Is there any way to get a build URL for spark-2.0 as we have for spark-1.5?

@jackylk (Contributor) commented Dec 26, 2016

@ashokblend, please check

- select imei,series from Carbon_automation_test where series='7Series' order by imei limit 10 *** FAILED ***

@CarbonDataQA

Build Success with Spark 1.5.2, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/335/

@CarbonDataQA

Build Failed with Spark 1.6.2, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/1079/

@chenliang613 (Contributor)

Please change the title as per the format: [CARBONDATA-<issue number>] Title of the pull request (a blank needs to be added after the bracket).

@ravipesala (Contributor)

retest this please

@CarbonDataQA

Build Success with Spark 1.6.2, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/1173/

@ravipesala (Contributor)

@ashokblend The Spark 2.1 test cases are failing; can you fix them?

@asfbot commented May 28, 2017

Refer to this link for build results (access rights to CI server needed):
https://builds.apache.org/job/carbondata-pr-spark-1.6/80/
Test FAILed.

@asfbot commented May 28, 2017

Refer to this link for build results (access rights to CI server needed):
https://builds.apache.org/job/carbondata-pr-spark-2.1/231/
Test PASSed.

@CarbonDataQA

Can one of the admins verify this patch?

@asfgit commented Aug 2, 2017

Can one of the admins verify this patch?

@asfgit commented Aug 2, 2017

Can one of the admins verify this patch?

@CarbonDataQA

SDV Build Failed with Spark 2.1, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/65/

@CarbonDataQA

Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/636/

@ravipesala (Contributor)

SDV Build Fail , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/1267/

@ashokblend (Contributor, Author)

Closing it.

@ashokblend closed this Dec 14, 2017
Beyyes pushed a commit to Beyyes/carbondata that referenced this pull request Jul 12, 2018

[Rest Server] Add label for gpu in docker container