[FLINK-13107][table-planner-blink] Copy TableApi IT and UT to Blink planner #9006
Conversation
Thanks a lot for your contribution to the Apache Flink project. I'm the @flinkbot. I help the community to review your pull request.

Automated Checks: last check on commit 9078425 (Tue Aug 06 15:39:38 UTC 2019). Warnings:

Mention the bot in a comment to re-run the automated checks.

Review Progress: please see the Pull Request Review Guide for a full explanation of the review process. The bot tracks the review progress through labels, which are applied according to the order of the review items. For consensus, approval by a Flink committer or PMC member is required.

Bot commands: the @flinkbot bot supports the following commands:
KurtYoung left a comment:
Thanks for the work!
...k-table-planner-blink/src/main/java/org/apache/flink/table/expressions/RexNodeConverter.java
```scala
// we have determined the row type before, just convert it to RelDataType
typeFactory.asInstanceOf[FlinkTypeFactory].createFieldTypeFromLogicalType(
  fromDataTypeToLogicalType(externalResultType))
val fieldTypes = TableEnvironment.getFieldTypes(externalResultType)
```
I would suggest not using TableEnvironment.getFieldTypes; it will be deleted soon.
OK, I will use FieldInfoUtils in the api module instead.
We need to copy all
```java
// NOTE: this list explicitly excludes data types that need further parameters
// exclusions: DECIMAL, INTERVAL YEAR TO MONTH, MAP, MULTISET, ROW, NULL, ANY
addDefaultDataType(boolean.class, DataTypes.BOOLEAN());
```
-1 on this change. Why do we need to convert an int to DataType(Integer)?
The code is similar to ClassDataTypeConverter in the table-common module. When a UDF has an eval method that explicitly returns int, we need to convert an int to a DataType.
I added the code here to fix the inconsistency between ClassDataTypeConverter and LegacyTypeInfoDataTypeConverter.
ClassDataTypeConverter maps a Class to a DataType. However, for primitive classes it explicitly applies the following logic to bind the conversion class:

```java
private static void addDefaultDataType(Class<?> clazz, DataType rootType) {
	final DataType dataType;
	if (clazz.isPrimitive()) {
		dataType = rootType.notNull();
	} else {
		dataType = rootType.nullable();
	}
	defaultDataTypes.put(clazz.getName(), dataType.bridgedTo(clazz));
}
```

So for a primitive class such as int.class, its DataType is bridged to int.class instead of Integer.class.
This logic is not consistent with LegacyTypeInfoDataTypeConverter:
the mapping in LegacyTypeInfoDataTypeConverter only contains DataTypes.INT().bridgedTo(Integer.class), not DataTypes.INT().bridgedTo(int.class).
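The mismatch can be sketched independently of Flink. Below is a minimal, hypothetical stand-in for the two converters (the class names BridgingMismatchSketch and the "INT NOT NULL bridged to ..." strings are illustrative, not real Flink type strings): one side registers primitive classes under their own names, the other only knows the boxed class, so a lookup by the primitive class name finds nothing.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of the described inconsistency: one converter registers
// primitive classes (bridged to int.class), while a legacy-style converter
// only registers the boxed class (Integer.class).
public class BridgingMismatchSketch {
    // stand-in for ClassDataTypeConverter's defaultDataTypes map
    static final Map<String, String> classConverter = new HashMap<>();
    // stand-in for LegacyTypeInfoDataTypeConverter's mapping
    static final Map<String, String> legacyConverter = new HashMap<>();

    static void addDefaultDataType(Class<?> clazz, String rootType) {
        // primitives become NOT NULL and are bridged to the primitive class itself
        String dataType = clazz.isPrimitive() ? rootType + " NOT NULL" : rootType;
        classConverter.put(clazz.getName(), dataType + " bridged to " + clazz.getName());
    }

    public static void main(String[] args) {
        addDefaultDataType(int.class, "INT");         // registered under "int"
        addDefaultDataType(Integer.class, "INT");     // registered under "java.lang.Integer"
        // the legacy-style converter only registers the boxed class
        legacyConverter.put(Integer.class.getName(), "INT bridged to java.lang.Integer");

        System.out.println(classConverter.get("int"));  // INT NOT NULL bridged to int
        System.out.println(legacyConverter.get("int")); // null -> the inconsistency
    }
}
```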
So why not change TypeInfoDataTypeConverter? That is the conversion util in Blink.
Good suggestion.
```scala
override def toString = s"sum($child)"

override private[flink] def resultType = child.resultType
override private[flink] def resultType = {
```
[FLINK-13107][table-planner-blink] Fix bug when infer resultType of some PlannerExpression.
[FLINK-13107][table-planner-blink] Derive sum, avg, div return type in planner expressions using behavior of blink ?
When running a SQL query with Blink, the behavior already follows Blink's semantics. It would be strange if the behavior differed when running a Table API query.
So I think we should use Blink's behavior.
Do you mean to change the commit message?
Yeah, the change is very good; I just think the commit message should be clearer.
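For context on what "deriving the sum return type using Blink's behavior" can mean, here is a minimal, hypothetical sketch under the assumption that a decimal sum widens precision to the maximum (38) while keeping the scale, so accumulating many rows does not overflow the input precision. The class and method names (SumTypeSketch, deriveSumType) are illustrative, not Flink's actual API:

```java
// Hypothetical sketch of a Blink-style sum result type derivation for decimals.
// Assumption: sum over DECIMAL(p, s) widens precision to 38, keeping the scale.
public class SumTypeSketch {
    static final int MAX_DECIMAL_PRECISION = 38;

    // returns the derived type rendered as "DECIMAL(p, s)" for readability
    static String deriveSumType(int precision, int scale) {
        return "DECIMAL(" + MAX_DECIMAL_PRECISION + ", " + scale + ")";
    }

    public static void main(String[] args) {
        System.out.println(deriveSumType(10, 2)); // DECIMAL(38, 2)
    }
}
```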
...k-table-planner-blink/src/main/scala/org/apache/flink/table/plan/util/SetOpRewriteUtil.scala
Force-pushed 4cb0ae7 to ce897c5 (compare).
twalthr left a comment:
What are the implications on the test times when we copy so many tests?
…TableFunction to using fieldNames which is specified.
…h has primitiveClass as convensionClass to TypeInformation.
…n planner expressions using behavior of blink
…is RowMode or RangeMode.
According to this: https://travis-ci.org/beyond1920/flink/builds/557233740, the test time increases by less than 2 minutes.
Travis passed here: https://travis-ci.org/beyond1920/flink/builds/557233740
What is the purpose of the change
This issue aims to copy the test cases in the following packages from the flink-planner and the original Blink to the Blink planner:
Brief change log
Verifying this change
UT, ITCase
Does this pull request potentially affect one of the following parts:
@Public(Evolving): (no)
Documentation