
Conversation

@beyond1920
Contributor

What is the purpose of the change

The issue aims to copy the test cases in the following packages from flink-planner and the original Blink to blink-planner:

  1. org.apache.flink.table.api.batch.table
  2. org.apache.flink.table.api.stream.table
  3. org.apache.flink.table.runtime.batch.table
  4. org.apache.flink.table.runtime.stream.table

Brief change log

  • Copy UT and ITCase of TableApi
  • Fix some bugs; the commit messages contain detailed information about those bugs

Verifying this change

UT, ITCase

Does this pull request potentially affect one of the following parts:

  • Dependencies (does it add or upgrade a dependency): (no)
  • The public API, i.e., is any changed class annotated with @Public(Evolving): (no)
  • The serializers: (no)
  • The runtime per-record code paths (performance sensitive): (no)
  • Anything that affects deployment or recovery: JobManager (and its components), Checkpointing, Yarn/Mesos, ZooKeeper: (no)
  • The S3 file system connector: (no)

Documentation

  • Does this pull request introduce a new feature? (no)
  • If yes, how is the feature documented? (not applicable)

@flinkbot
Collaborator

flinkbot commented Jul 6, 2019

Thanks a lot for your contribution to the Apache Flink project. I'm the @flinkbot. I help the community
to review your pull request. We will use this comment to track the progress of the review.

Automated Checks

Last check on commit 9078425 (Tue Aug 06 15:39:38 UTC 2019)

Warnings:

  • No documentation files were touched! Remember to keep the Flink docs up to date!

Mention the bot in a comment to re-run the automated checks.

Review Progress

  • ❓ 1. The [description] looks good.
  • ❓ 2. There is [consensus] that the contribution should go into Flink.
  • ❓ 3. Needs [attention] from.
  • ❓ 4. The change fits into the overall [architecture].
  • ❓ 5. Overall code [quality] is good.

Please see the Pull Request Review Guide for a full explanation of the review process.

Details
The Bot is tracking the review progress through labels. Labels are applied according to the order of the review items. For consensus, approval by a Flink committer or PMC member is required.

Bot commands
The @flinkbot bot supports the following commands:

  • @flinkbot approve description to approve one or more aspects (aspects: description, consensus, architecture and quality)
  • @flinkbot approve all to approve all aspects
  • @flinkbot approve-until architecture to approve everything until architecture
  • @flinkbot attention @username1 [@username2 ..] to require somebody's attention
  • @flinkbot disapprove architecture to remove an approval you gave earlier

Contributor

@KurtYoung left a comment


Thanks for the work!

// we have determined the row type before, just convert it to RelDataType
typeFactory.asInstanceOf[FlinkTypeFactory].createFieldTypeFromLogicalType(
fromDataTypeToLogicalType(externalResultType))
val fieldTypes = TableEnvironment.getFieldTypes(externalResultType)
Contributor


I would suggest not using TableEnvironment.getFieldTypes; it will be deleted soon.

Contributor Author


OK, I will use FieldInfoUtils from the api module instead.
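The direction agreed on here (depending on a shared field-info utility rather than on a method of TableEnvironment) can be illustrated with a self-contained sketch. Everything below is hypothetical: the class name FieldInfoUtilsSketch, the toy ROW<...> descriptor format, and the parser are illustrative stand-ins, not Flink APIs.

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

// Hypothetical stand-in for a shared field-info utility; not a Flink class.
public class FieldInfoUtilsSketch {

    // Callers depend on this small, dedicated utility instead of a method on
    // TableEnvironment, so TableEnvironment.getFieldTypes can be deleted
    // without breaking them.
    public static List<String> getFieldTypes(String rowTypeDescriptor) {
        // Toy parser for an illustrative descriptor, e.g. "ROW<a INT, b STRING>"
        String inner = rowTypeDescriptor.substring(
                rowTypeDescriptor.indexOf('<') + 1, rowTypeDescriptor.lastIndexOf('>'));
        return Arrays.stream(inner.split(","))
                .map(field -> field.trim().split("\\s+")[1])
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        System.out.println(getFieldTypes("ROW<a INT, b STRING>")); // [INT, STRING]
    }
}
```

The design point is simply that field-type extraction lives in one utility with a stable entry point, so refactorings of TableEnvironment do not ripple out to its callers.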

@flinkbot
Collaborator

flinkbot commented Jul 7, 2019

CI report for commit 7da887d: SUCCESS Build

@flinkbot
Collaborator

flinkbot commented Jul 7, 2019

CI report for commit b756b1f: SUCCESS Build

@JingsongLi
Contributor

Do we need to copy all of org.apache.flink.table.api.*? I think we already have duplicate plan cases in org.apache.flink.table.table.*. Why not put the cases into org.apache.flink.table.table.* to be consistent with the previous SQL plan tests?


// NOTE: this list explicitly excludes data types that need further parameters
// exclusions: DECIMAL, INTERVAL YEAR TO MONTH, MAP, MULTISET, ROW, NULL, ANY
addDefaultDataType(boolean.class, DataTypes.BOOLEAN());
Contributor


-1 on this change. Why do we need to convert an int to DataType(Integer)?

Contributor Author


The code is similar to ClassDataTypeConverter in the table-common module. When a UDF has an eval method that explicitly returns the primitive int type, we need to convert int to a DataType.
I added the code here to fix the inconsistency between ClassDataTypeConverter and LegacyTypeInfoDataTypeConverter.
ClassDataTypeConverter maps a Class to a DataType; however, for primitive classes it explicitly binds the conversion class like this:

private static void addDefaultDataType(Class<?> clazz, DataType rootType) {
	final DataType dataType;
	if (clazz.isPrimitive()) {
		dataType = rootType.notNull();
	} else {
		dataType = rootType.nullable();
	}
	defaultDataTypes.put(clazz.getName(), dataType.bridgedTo(clazz));
}

So for a primitive class such as int.class, its DataType is bridged to int.class instead of Integer.class.
That logic is not consistent with LegacyTypeInfoDataTypeConverter: its mapping only contains DataTypes.INT().bridgedTo(Integer.class), not DataTypes.INT().bridgedTo(int.class).
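The inconsistency described above can be reproduced without Flink on the classpath. In the sketch below the two registries and the "INT" marker are hypothetical stand-ins (no Flink classes are used): the default side bridges a primitive class to itself, while the legacy side only knows the boxed class, so resolving a primitive eval() return type falls through.

```java
import java.util.HashMap;
import java.util.Map;

// Self-contained illustration of the ClassDataTypeConverter /
// LegacyTypeInfoDataTypeConverter inconsistency; all names are illustrative.
public class BridgingMismatch {

    // Mimics ClassDataTypeConverter: each class is bridged to itself,
    // so a primitive class is stored as the primitive, not the boxed type.
    private static final Map<String, Class<?>> defaultBindings = new HashMap<>();

    // Mimics LegacyTypeInfoDataTypeConverter: only boxed classes are known.
    private static final Map<Class<?>, String> legacyMapping = new HashMap<>();

    private static void addDefaultDataType(Class<?> clazz) {
        // Like dataType.bridgedTo(clazz): the class itself becomes the binding.
        defaultBindings.put(clazz.getName(), clazz);
    }

    static {
        addDefaultDataType(int.class);           // binds "int" -> int.class
        addDefaultDataType(Integer.class);       // binds "java.lang.Integer" -> Integer.class
        legacyMapping.put(Integer.class, "INT"); // legacy side knows only Integer.class
    }

    // Resolving a UDF eval() return type: class -> default binding -> legacy lookup.
    public static String resolve(Class<?> evalReturnType) {
        Class<?> bound = defaultBindings.get(evalReturnType.getName());
        return legacyMapping.get(bound); // null for int.class: the inconsistency
    }

    public static void main(String[] args) {
        System.out.println(resolve(Integer.class)); // INT
        System.out.println(resolve(int.class));     // null
    }
}
```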

Contributor


So why not change TypeInfoDataTypeConverter? That is the conversion utility in blink.

Contributor Author


Good suggestion.

override def toString = s"sum($child)"

override private[flink] def resultType = child.resultType
override private[flink] def resultType = {
Contributor

@JingsongLi Jul 8, 2019


[FLINK-13107][table-planner-blink] Fix bug when infer resultType of some PlannerExpression.

[FLINK-13107][table-planner-blink] Derive sum, avg, div return type in planner expressions using the behavior of blink?

Contributor Author

@beyond1920 Jul 8, 2019


When you run a SQL query with blink, the query already follows blink's behavior. It would be strange if the behavior were different when running a Table API query, so I think we should use blink's behavior.
Do you mean I should change the commit message?

Contributor


Yeah, the change is very good; I just think the commit message should be clearer.
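For background, the kind of result-type derivation discussed in this thread can be sketched self-containedly. The promotion rules below are illustrative assumptions made up for the example, not blink's actual rules; the point is only that the Table API expression path must derive the same types as the SQL path.

```java
// Illustrative sketch only: these promotion rules are assumptions,
// not blink's actual derivation rules.
public class AggReturnType {

    public enum LogicalType { INT, BIGINT, DOUBLE, DECIMAL }

    // Assumed rule for the sketch: SUM keeps the input type.
    public static LogicalType sumType(LogicalType in) {
        return in;
    }

    // Assumed rule for the sketch: AVG of an exact integer type yields DOUBLE.
    public static LogicalType avgType(LogicalType in) {
        switch (in) {
            case INT:
            case BIGINT:
                return LogicalType.DOUBLE;
            default:
                return in;
        }
    }

    public static void main(String[] args) {
        // The bug class fixed here: if the Table API expression path derived
        // these types differently from the SQL path, the same query would
        // produce different schemas depending on which API was used.
        System.out.println(sumType(LogicalType.INT)); // INT
        System.out.println(avgType(LogicalType.INT)); // DOUBLE
    }
}
```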

@beyond1920 force-pushed the TableApiTest branch 4 times, most recently from 4cb0ae7 to ce897c5, on July 8, 2019 07:42
@flinkbot
Collaborator

flinkbot commented Jul 8, 2019

CI report for commit ce897c5: FAILURE Build

Contributor

@twalthr twalthr left a comment


What are the implications on the test times when we copy so many tests?

@flinkbot
Collaborator

flinkbot commented Jul 11, 2019

CI report:

@KurtYoung
Contributor

What are the implications on the test times when we copy so many tests?

According to this: https://travis-ci.org/beyond1920/flink/builds/557233740, the test time increases by less than 2 minutes.

@KurtYoung
Contributor

Travis passed here: https://travis-ci.org/beyond1920/flink/builds/557233740
I'm merging this.

@KurtYoung closed this in 116c10b on Jul 11, 2019
