[CARBONDATA-2609] Change RPC implementation to Hadoop RPC framework #2372
Conversation
Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6331/
Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5169/
SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5283/
SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5293/
Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6343/
Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5181/
Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6347/
SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5297/
Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5185/
Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6359/
Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5197/
Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6360/
Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5198/
SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5308/
SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5309/
Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6362/
Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5200/
Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6363/
Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5201/
SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5311/
Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6364/
Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5202/
SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5312/
SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5313/
```diff
@@ -80,7 +80,7 @@ public void initialize(InputSplit inputSplit, TaskAttemptContext context)
     }
     // It should use the exists tableBlockInfos if tableBlockInfos of queryModel is not empty
     // otherwise the prune is no use before this method
-    if (!queryModel.isFG()) {
+    if (queryModel.getTableBlockInfos().isEmpty()) {
```
If we use `queryModel.getTableBlockInfos().isEmpty()`, then when the FG prune result is empty in search mode, it will fall back to the original TableBlockInfos and execute again, which means the FG prune has no effect in this scenario. So it's better to keep the `queryModel.isFG()` check and not change it to `queryModel.getTableBlockInfos().isEmpty()`.
I have added a new RecordReader called IndexedRecordReader, and I am using it in search mode now, so this problem will not occur. I will remove line 83.
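The concern in this thread boils down to a sentinel-value ambiguity: an empty block list can mean either "no prune ran" or "the prune ran and matched nothing", and an `isEmpty()` check cannot tell the two apart. A minimal stdlib sketch of that distinction, with hypothetical names (this is not CarbonData's actual code):

```java
import java.util.Collections;
import java.util.List;

public class PruneFallback {
    // Hypothetical helper: choose which blocks to scan. The explicit
    // fgPruneRan flag plays the role of queryModel.isFG(); relying on
    // pruned.isEmpty() alone would conflate "never pruned" with
    // "pruned down to nothing" and wrongly rescan everything.
    static List<String> selectBlocks(List<String> pruned,
                                     List<String> original,
                                     boolean fgPruneRan) {
        if (!fgPruneRan) {
            return original; // no prune before this point: scan all blocks
        }
        return pruned;       // prune ran; empty legitimately means "scan nothing"
    }

    public static void main(String[] args) {
        List<String> original = List.of("block1", "block2");
        // FG prune ran and matched nothing: the correct answer is the
        // empty list, not a fallback to the original two blocks.
        System.out.println(selectBlocks(Collections.emptyList(), original, true));
    }
}
```

With an `isEmpty()` test in place of the flag, the first case above would silently degrade into a full scan, which is exactly the scenario the reviewer describes.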
Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5258/
Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6425/
Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6428/
SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5363/
SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5364/
Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6433/
Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5264/
fix test fix test wip wip wip wip wip wip fix test fix test fix test fix test fix test fix comment refactor fix test
Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6458/
Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5290/
SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5383/
```diff
@@ -137,7 +149,7 @@ class SearchModeTestCase extends QueryTest with BeforeAndAfterAll {
     sql("DROP DATAMAP if exists dm3 ON TABLE main")
   }

-  test("start search mode twice") {
+  ignore("start search mode twice") {
```
Why ignore it?
Please close it; this has already been merged into the carbonstore branch.
In Carbon Search Mode, change the RPC implementation to use Hadoop RPC framework, decoupling from Spark.
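At its core, Hadoop's RPC framework (like the Spark RPC it replaces here) hands callers a dynamic proxy for a protocol interface and marshals each method invocation to the server, so callers stay decoupled from the transport. A minimal stdlib sketch of that proxy pattern, with a hypothetical `SearchProtocol` interface and a local handler standing in for the network layer (not the PR's actual classes):

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;

// Hypothetical protocol interface; the PR defines CarbonData's real
// search-mode protocol, this only illustrates the pattern.
interface SearchProtocol {
    String search(String query);
}

public class RpcProxySketch {
    // Analogous to Hadoop's RPC.getProxy: callers receive a dynamic proxy
    // for the protocol interface, and every call is routed through a
    // pluggable handler (the real one would serialize over the wire).
    static SearchProtocol getProxy(InvocationHandler transport) {
        return (SearchProtocol) Proxy.newProxyInstance(
            SearchProtocol.class.getClassLoader(),
            new Class<?>[] { SearchProtocol.class },
            transport);
    }

    public static void main(String[] args) {
        // Stand-in for a remote server: echoes the method name and argument.
        SearchProtocol client = getProxy((proxy, method, a) ->
            method.getName() + "(" + a[0] + ")");
        System.out.println(client.search("select * from t")); // search(select * from t)
    }
}
```

Because callers only see `SearchProtocol`, swapping the transport behind the proxy (Spark RPC out, Hadoop RPC in) requires no change at the call sites, which is the decoupling this PR is after.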
Any interfaces changed?
No
Any backward compatibility impacted?
No
Document update required?
No
Testing done
Please provide details on
- Whether new unit test cases have been added or why no new tests are required?
- How it is tested? Please attach test report.
- Is it a performance related change? Please attach the performance test report.
- Any additional information to help reviewers in testing this change.
For large changes, please consider breaking it into sub-tasks under an umbrella JIRA.
NA