[CARBONDATA-1954] [Pre-Aggregate] CarbonHiveMetastore updated while dropping the Pre-Aggregate table & code refactored #1743
Conversation
SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/2633/
Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/2447/
Build Success with Spark 2.2.0, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/1227/
retest sdv please
SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/2645/
@@ -76,6 +78,24 @@ class TestDataMapCommand extends QueryTest with BeforeAndAfterAll {
    assert(dataMapSchemaList.get(2).getChildSchema.getTableName.equals("datamaptest_datamap3"))
  }

  test("check hivemetastore after drop datamap") {
CarbonProperties.getInstance().addProperty(CarbonCommonConstants.ENABLE_HIVE_SCHEMA_META_STORE, |
reset this property
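A sketch of what resetting could look like, assuming the test class uses ScalaTest's `BeforeAndAfterAll` as shown in the diff header (the `afterAll` body here is illustrative, not the PR's actual code):

```scala
// Illustrative sketch: restore the hive-metastore switch to its default
// after the test, so later tests are not affected by this property.
override def afterAll(): Unit = {
  sql("drop table if exists hiveMetaStoreTable")
  CarbonProperties.getInstance().addProperty(
    CarbonCommonConstants.ENABLE_HIVE_SCHEMA_META_STORE,
    CarbonCommonConstants.ENABLE_HIVE_SCHEMA_META_STORE_DEFAULT)
}
```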
    var dataMapSchemaList = table.getTableInfo.getDataMapSchemaList
    assert(dataMapSchemaList.size() == 1)

sql("drop datamap if exists datamap_hiveMetaStoreTable on table hiveMetaStoreTable") |
- Remove `if exists`
- Drop the main table also
- Use the get-datamap list to verify that the datamap table is dropped
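The three review points above could translate into a test tail roughly like this (a sketch; re-reading the schema list after the drop is an assumption, not code taken from the PR):

```scala
// Drop the datamap without "if exists", verify through the datamap
// schema list that the child table is gone, then drop the main table.
sql("drop datamap datamap_hiveMetaStoreTable on table hiveMetaStoreTable")
dataMapSchemaList = table.getTableInfo.getDataMapSchemaList
assert(dataMapSchemaList.size() == 0)
sql("drop table hiveMetaStoreTable")
```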
@@ -194,6 +194,11 @@ class CarbonHiveMetaStore extends CarbonFileMetastore {
      newTablePath)
    val dbName = oldTableIdentifier.getDatabaseName
    val tableName = oldTableIdentifier.getTableName
val schemaParts = CarbonUtil.convertToMultiGsonStrings(wrapperTableInfo, "=", "'", "") |
This function is not required; use the existing function above, updateHiveMetaStoreForAlter.
Change the PR title and the issue title to describe the problem.
Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/2513/
Build Success with Spark 2.2.0, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/1296/
SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/2681/
@@ -65,7 +65,7 @@ object DropDataMapPostListener extends OperationEventListener {
        Some(dataMapSchema.get.getRelationIdentifier.getDatabaseName),
        dataMapSchema.get.getRelationIdentifier.getTableName,
        dropChildTable = true
-     ).run(sparkSession)
+     ).processMetadata(sparkSession)
I think it is better to move this invocation to CarbonDropDataMapCommand.processMetadata.
@jackylk the code is moved to CarbonDropDataMapCommand.
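For context, moving the invocation means the listener's drop logic lands inside the command, roughly like this (a sketch; the `ifExistsSet` flag name and the surrounding guard are assumptions, not the PR's actual code):

```scala
// Inside CarbonDropDataMapCommand.processMetadata (sketch): after removing
// the datamap schema, drop the child (pre-aggregate) table's metadata here
// instead of from DropDataMapPostListener.
if (dataMapSchema.isDefined) {
  CarbonDropTableCommand(
    ifExistsSet = true, // assumed parameter name
    Some(dataMapSchema.get.getRelationIdentifier.getDatabaseName),
    dataMapSchema.get.getRelationIdentifier.getTableName,
    dropChildTable = true
  ).processMetadata(sparkSession)
}
```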
        CarbonCommonConstants.ENABLE_HIVE_SCHEMA_META_STORE_DEFAULT)
    }
  }

Besides CarbonDropDataMapCommand.scala, CarbonDropTableCommand.scala also has a problem: processMetadata is incorrect if there are datamaps associated with the fact table. Please modify that in this PR as well and add a test case.
CarbonDropDataMapCommand is also modified and a test case is added.
Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/2548/
Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/1323/
SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/2718/
SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/2743/
retest sdv please
SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/2745/
Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/2580/
Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/2588/
SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/2751/
Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/1353/
SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/2780/
Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/2618/
Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/1383/
SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/2781/
Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/1384/
Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/2619/
SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/2782/
Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/2620/
Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/1385/
…aggregate table & code refactored to process metadata only if processMetadata() is called for preaggregate table
Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/1404/
SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/2798/
Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/2639/
SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/2800/
LGTM
Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/1410/
Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/2646/
    CarbonProperties.getInstance()
      .addProperty(CarbonCommonConstants.ENABLE_HIVE_SCHEMA_META_STORE,
        "true")
sql("drop datamap if exists datamap_hiveMetaStoreTable on table hiveMetaStoreTable") |
where is hiveMetaStoreTable?
…ropping the Pre-Aggregate table & code refactored 1. To update CarbonHiveMetastore, a similar function already existed; removed the duplicate function definition and updated the caller. 2. Code refactored so that when dropping a pre-aggregate table, only metadata is deleted if processMetadata() is called. This closes apache#1743
This PR contains:
1. To update CarbonHiveMetastore, a similar function already existed. Removed the duplicate function definition and updated the caller.
2. Code refactored so that when dropping a pre-aggregate table, only metadata is deleted if processMetadata() is called.
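Point 2 builds on the split between the metadata and data phases in Carbon's command classes; the pattern is roughly as follows (a sketch of the idea with a hypothetical class name, not the actual CarbonData source):

```scala
// Sketch: run() performs both phases, so a caller that only needs
// metadata cleanup (e.g. when dropping a pre-aggregate child table)
// can invoke processMetadata() alone.
abstract class AtomicRunnableCommandSketch {
  def processMetadata(sparkSession: SparkSession): Seq[Row]
  def processData(sparkSession: SparkSession): Seq[Row]
  def run(sparkSession: SparkSession): Seq[Row] =
    processMetadata(sparkSession) ++ processData(sparkSession)
}
```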
Any interfaces changed? No
Any backward compatibility impacted? No
Document update required? No
Testing done - Yes, test case added
For large changes, please consider breaking it into sub-tasks under an umbrella JIRA. NA