[CARBONDATA-2359] Support applicable load options and table properties for Non-Transactional table #2190
Conversation
Retest this please
Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/5214/
Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4033/
Force-pushed from 71d9e66 to bf9ee9f
Retest this please
Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4043/
Force-pushed from 243b276 to ae07151
Retest this please
Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/5232/
Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4055/
Force-pushed from ae07151 to 227269a
Retest this please
Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/5242/
Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4068/
Force-pushed from 227269a to bdd463e
Retest this please
Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/5255/
Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4074/
Retest this please
Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4076/
Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/5257/
 * @return measures present in the block
 */
public static List<ProjectionMeasure> createMeasureInfoAndGetCurrentBlockQueryMeasures(
    BlockExecutionInfo blockExecutionInfo, List<ProjectionMeasure> queryMeasures,
    List<CarbonMeasure> currentBlockMeasures, boolean isUnManagedTable) {
    List<CarbonMeasure> currentBlockMeasures) {
For a non-transactional table, the column id check should not be present during read, because the files can be created somewhere else and then copied in.
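The point above can be sketched as follows. `ColumnMatcher` and its method are hypothetical names, illustrating only the idea being requested: a transactional read can rely on the persisted column unique id, while a non-transactional read must skip that check and match by column name, since the files may have been written by a different SDK writer and copied in.

```java
import java.util.Objects;

public class ColumnMatcher {
    // Hypothetical helper: decides whether a column in the query schema
    // matches a column found in a data file.
    public static boolean matches(String queryName, String queryId,
                                  String fileName, String fileId,
                                  boolean isTransactionalTable) {
        if (isTransactionalTable) {
            // Transactional table: files were written by this table,
            // so the persisted column unique id must match.
            return Objects.equals(queryId, fileId);
        }
        // Non-transactional table: files may have been created elsewhere
        // and copied, so skip the column id check and match by name only.
        return queryName.equalsIgnoreCase(fileName);
    }
}
```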
tableSchemaBuilder = tableSchemaBuilder.isUnmanagedTable(isUnManagedTable);
}

List<String> sortColumnsList = new ArrayList<>();
Sort options behaviour should be consistent with create table:
If an empty sort columns list is passed, it should mean no sort.
If the sort column list is not passed, the default behaviour should be the same as for create table.
Also mention this behaviour in the SDK javadoc.
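The semantics requested above can be sketched as a small resolver; the class and method names are hypothetical, and the create-table default is passed in rather than hard-coded, since the review only asks that it match create-table behaviour:

```java
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class SortColumnsResolver {
    // Sketch of the requested semantics:
    //  - sortColumns == null  -> option not passed; use the create-table default
    //  - sortColumns empty    -> explicit no-sort
    //  - otherwise            -> use the given columns
    public static List<String> resolve(String[] sortColumns,
                                       List<String> createTableDefault) {
        if (sortColumns == null) {
            return createTableDefault;
        }
        if (sortColumns.length == 0) {
            return Collections.emptyList(); // no sort
        }
        return Arrays.asList(sortColumns);
    }
}
```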
ok. done
@@ -119,7 +126,18 @@ public TableSchemaBuilder addColumn(StructField field, boolean isSortColumn) {
}
newColumn.setSchemaOrdinal(ordinal++);
newColumn.setColumnar(true);
newColumn.setColumnUniqueId(UUID.randomUUID().toString());

// For unmanagedTable, multiple sdk writer output with same column name can be placed in
Change "unmanaged" to "transactional table" in all code added by this PR.
ok. done
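The hunk above assigns a random UUID as the column unique id. A minimal sketch of the distinction under discussion (class and method names are hypothetical, and the deterministic scheme shown is only illustrative): a transactional table can use a random id, while a non-transactional table may see output from multiple SDK writers with the same column name placed in one folder, so the id must be reproducible across writers.

```java
import java.util.UUID;

public class ColumnIdAssigner {
    // Sketch only: how a schema builder might choose a column unique id.
    public static String assignUniqueId(String columnName, boolean isTransactionalTable) {
        if (isTransactionalTable) {
            // Transactional table: the id only has to be unique within this table.
            return UUID.randomUUID().toString();
        }
        // Non-transactional table: files from different SDK writers with the
        // same column name may share a folder, so derive a reproducible id
        // from the column name instead of generating a random one.
        return columnName.toLowerCase();
    }
}
```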
 * e. timestampformat -- same as JAVA SimpleDateFormat
 * @return updated CarbonWriterBuilder
 */
public CarbonWriterBuilder withLoadOptions(Map<String, String> options) {
Complex-type level delimiters should be supported, and the quote char and escape char also need to be supported, as they are required for parsing complex types.
ok. done
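As a usage sketch, the load options discussed above are just a key/value map handed to the writer builder. The keys below mirror CarbonData's load option names as listed in this thread (bad records handling, timestamp format, complex-type delimiters, quote char, escape char), but the exact spellings and accepted values should be checked against the final API before relying on them:

```java
import java.util.HashMap;
import java.util.Map;

public class LoadOptionsExample {
    // Builds an illustrative load-options map for the SDK writer.
    public static Map<String, String> buildOptions() {
        Map<String, String> options = new HashMap<>();
        options.put("bad_records_logger_enable", "true");
        options.put("bad_records_action", "REDIRECT");
        // Same pattern syntax as Java SimpleDateFormat:
        options.put("timestampformat", "yyyy-MM-dd HH:mm:ss");
        // Options requested in the review for complex-type parsing:
        options.put("complex_delimiter_level_1", "$");
        options.put("complex_delimiter_level_2", ":");
        options.put("quotechar", "\"");
        options.put("escapechar", "\\");
        return options;
    }
}
```

The resulting map would then be passed via something like `CarbonWriter.builder().withLoadOptions(buildOptions())` (signature as shown in the diff above).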
Assert.assertEquals("robot" + (i % 10), row[0]);
Assert.assertEquals(i, row[1]);
Object[] row = (Object[]) reader.readNextRow();
// TODO: Default sort column is applied for dimensions. So, need to validate accordingly
TODO: to be corrected
 * To support the load options for sdk writer
 * @param options key,value pair of load options.
 * supported keys values are
 * a. bad_records_logger_enable -- true, false
Please follow the standard documentation format.
ok
Force-pushed from bdd463e to a35ba71
Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/5296/
Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4117/
Force-pushed from a35ba71 to f286a13
Retest this please
Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4119/
Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/5299/
Retest this please
Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4121/
Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/5301/
Retest this please
Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4123/
Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/5303/
@@ -75,6 +78,11 @@ public TableSchemaBuilder tableName(String tableName) {
return this;
}

public TableSchemaBuilder resetTransactionalTable(boolean isTransactionalTable) {
Change the function name to setTransactionTable; there is nothing being reset in this function.
ok
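The rename requested above amounts to a plain builder-style setter. A minimal self-contained sketch (the class is a stand-in, and the field and accessor names are hypothetical):

```java
public class TableSchemaBuilder {
    private boolean isTransactionalTable = true; // hypothetical default

    // Plain setter in builder style: nothing is "reset" here, which is why
    // setTransactionalTable describes the behaviour better than
    // resetTransactionalTable.
    public TableSchemaBuilder setTransactionalTable(boolean isTransactionalTable) {
        this.isTransactionalTable = isTransactionalTable;
        return this;
    }

    public boolean isTransactionalTable() {
        return isTransactionalTable;
    }
}
```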
List<String> sortColumnsList;
if (sortColumns != null) {
  sortColumnsList = Arrays.asList(sortColumns);
if (!isTransactionalTable) {
Always set the value directly on tableSchemaBuilder.
ok
…s for Non-Transactional table Support read multiple sdk writer placed at same path
Force-pushed from f286a13 to d52b3f9
retest this please
Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4128/
Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/5308/
LGTM
…erties for Non-Transactional table Support read multiple sdk writer placed at same path This closes apache#2190
[CARBONDATA-2359] Support applicable load options and table properties for a Non-Transactional table
And blocked clean files for a Non-Transactional table.
Any interfaces changed? No existing interfaces were modified; new interfaces were added.
Any backward compatibility impacted? No.
Document update required? Yes, it will be handled in a separate PR.
Testing done? Yes, please refer to TestUnmanagedCarbonTable.scala.
For large changes, please consider breaking it into sub-tasks under an umbrella JIRA. Done.