
[FLINK-13841][hive] Extend Hive version support to all 1.2 and 2.3 versions #9524

Closed
xuefuz wants to merge 3 commits

Conversation


@xuefuz xuefuz commented Aug 23, 2019


What is the purpose of the change

Support all 1.2.x and 2.3.x Hive versions instead of only 1.2.1 and 2.3.4, as is currently the case.

Brief change log

  • Added new shims for each version
  • Established inheritance relationships between the shims
  • Shimmed methods to handle API differences (see the sketch after this list)
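As a rough illustration of the shim approach (all class and method names below are hypothetical, not the actual Flink ones), each new minor version extends the closest existing shim and overrides only the methods whose Hive API differs:

	import java.io.IOException;

	import org.apache.hadoop.conf.Configuration;
	import org.apache.hadoop.fs.FileSystem;
	import org.apache.hadoop.fs.Path;

	// Hypothetical shim interface: one method per call whose API varies across Hive versions.
	interface HiveVersionShim {
		boolean moveToTrash(FileSystem fs, Path path, Configuration conf, boolean purge) throws IOException;
	}

	// Base shim for the oldest supported 1.2 release.
	class ShimV120 implements HiveVersionShim {
		@Override
		public boolean moveToTrash(FileSystem fs, Path path, Configuration conf, boolean purge) throws IOException {
			// invoke the 1.2.0 variant of the underlying API here (omitted in this sketch)
			return false;
		}
	}

	// 1.2.1 behaves the same for the shimmed calls, so it only inherits.
	class ShimV121 extends ShimV120 {
	}

	// 1.2.2 overrides only the methods whose API actually changed.
	class ShimV122 extends ShimV121 {
		@Override
		public boolean moveToTrash(FileSystem fs, Path path, Configuration conf, boolean purge) throws IOException {
			// adjusted call for the 1.2.2 API (omitted in this sketch)
			return false;
		}
	}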

Verifying this change

This change is already covered by existing tests; the tests were also run manually for all versions.

Does this pull request potentially affect one of the following parts:

  • Dependencies (does it add or upgrade a dependency): (no)
  • The public API, i.e., is any changed class annotated with @Public(Evolving): (no)
  • The serializers: (no)
  • The runtime per-record code paths (performance sensitive): (no)
  • Anything that affects deployment or recovery: JobManager (and its components), Checkpointing, Yarn/Mesos, ZooKeeper: (no)
  • The S3 file system connector: (no)

Documentation

  • Does this pull request introduce a new feature? (no)
  • If yes, how is the feature documented? (not applicable / docs / JavaDocs / not documented)


flinkbot commented Aug 23, 2019

Thanks a lot for your contribution to the Apache Flink project. I'm the @flinkbot. I help the community
to review your pull request. We will use this comment to track the progress of the review.

Automated Checks

Last check on commit a077b6c (Tue Aug 27 18:37:48 UTC 2019)

Warnings:

  • No documentation files were touched! Remember to keep the Flink docs up to date!

Mention the bot in a comment to re-run the automated checks.

Review Progress

  • ❓ 1. The [description] looks good.
  • ❓ 2. There is [consensus] that the contribution should go into Flink.
  • ❓ 3. Needs [attention] from.
  • ❓ 4. The change fits into the overall [architecture].
  • ❓ 5. Overall code [quality] is good.

Please see the Pull Request Review Guide for a full explanation of the review process.


The Bot is tracking the review progress through labels. Labels are applied according to the order of the review items. For consensus, approval by a Flink committer or PMC member is required.

Bot commands
The @flinkbot bot supports the following commands:

  • @flinkbot approve description to approve one or more aspects (aspects: description, consensus, architecture and quality)
  • @flinkbot approve all to approve all aspects
  • @flinkbot approve-until architecture to approve everything until architecture
  • @flinkbot attention @username1 [@username2 ..] to require somebody's attention
  • @flinkbot disapprove architecture to remove an approval you gave earlier

flinkbot commented Aug 23, 2019

CI report:

@bowenli86 bowenli86 left a comment


@xuefuz thanks for the PR!

Apart from a few minor formatting issues, what is the testing plan for all these Hive versions? We probably need to run nightly builds for all versions, but not necessarily for every PR build.

@Override
public SimpleGenericUDAFParameterInfo createUDAFParameterInfo(ObjectInspector[] params, boolean isWindowing, boolean distinct, boolean allColumns) {
	try {
		Constructor constructor = SimpleGenericUDAFParameterInfo.class.getConstructor(ObjectInspector[].class,
			boolean.class, boolean.class, boolean.class);

bowenli86 (Member) commented on the lines above: revert?

		return (SimpleGenericUDAFParameterInfo) constructor.newInstance(params, isWindowing, distinct, allColumns);
	} catch (NoSuchMethodException | IllegalAccessException | InstantiationException | InvocationTargetException e) {
		throw new CatalogException("Failed to create SimpleGenericUDAFParameterInfo", e);
	}
}

bowenli86 (Member): revert?

xuefuz (Contributor, Author): This is actually desirable, as a newline was missing at the end of the file.

@@ -86,7 +88,7 @@ public Function getFunction(IMetaStoreClient client, String dbName, String funct
public boolean moveToTrash(FileSystem fs, Path path, Configuration conf, boolean purge) throws IOException {
	try {
		Method method = FileUtils.class.getDeclaredMethod("moveToTrash", FileSystem.class, Path.class,
			Configuration.class, boolean.class);

bowenli86 (Member) commented on the lines above: revert?

@@ -99,14 +101,35 @@ public void alterTable(IMetaStoreClient client, String databaseName, String tabl
	client.alter_table(databaseName, tableName, table);
}

@Override
public void alterPartition(IMetaStoreClient client, String databaseName, String tableName, Partition partition)
	throws InvalidOperationException, MetaException, TException {

bowenli86 (Member): nit: needs an extra tab


@xuefuz xuefuz left a comment


Re: testing plan

The general idea is to run tests against those versions at the end of a release cycle, including unit tests and end-to-end tests. Daily tests against one version of each major line can continue, but we cannot do so for every version we support.

We can discuss testing further should there be any disagreement.

xuefuz commented Aug 26, 2019

PR updated based on review feedback. @bowenli86 could you take another look? Thanks.

@xuefuz xuefuz closed this Aug 26, 2019
@xuefuz xuefuz reopened this Aug 26, 2019
bowenli86 commented Aug 26, 2019

LGTM. W.r.t. testing, shall we at least add build profiles for these newly added Hive versions in flink-connector-hive's pom so that, for now, they can be run manually?

E.g.

		<profile>
			<id>hive-1.2.0</id>
			<properties>
				<hive.version>1.2.0</hive.version>
				<hivemetastore.hadoop.version>2.6.5</hivemetastore.hadoop.version>
				<hiverunner.version>3.2.1</hiverunner.version>
			</properties>
		</profile>
               ...

I created FLINK-13866 to track the testing plan.
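If such profiles are added, the connector tests could presumably be run manually against a specific Hive version by activating the profile, e.g. (a sketch only; the profile id matches the snippet above, and the module directory is an assumption):

	mvn clean verify -Phive-1.2.0   # run inside the flink-connector-hive module (assumed location)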

xuefuz commented Aug 26, 2019

> LGTM. W.r.t. testing, shall we at least add build profiles for these newly added Hive versions in flink-connector-hive's pom so that, for now, they can be run manually? [...]
> I created FLINK-13866 to track the testing plan.

Let's discuss this in the JIRA you created. As a short comment, I don't think we need to provide a profile for every version we support; there are just too many of them. However, this and how to test in general can be discussed.

bowenli86 commented Aug 27, 2019

> Let's discuss this in the JIRA you created. [...]

That's alright, since there is no difference between the shims within the same major version (1.2 and 2.3), and the current testing setup still covers the change. Support for other major versions may need a corresponding testing plan to come along.
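To make the "no difference within a major version" point concrete, here is a small illustrative sketch (names are made up, not Flink's actual shim loader): every supported minor version is registered, but the minor versions of one major line map to shims that share the same implementation, so testing one minor version per major line already exercises the shimmed code paths.

	import java.util.HashMap;
	import java.util.Map;

	// Illustrative only: map a Hive version string to the name of the shim handling it.
	class HiveShimRegistrySketch {
		private static final Map<String, String> VERSION_TO_SHIM = new HashMap<>();

		static {
			VERSION_TO_SHIM.put("1.2.0", "ShimV120");
			VERSION_TO_SHIM.put("1.2.1", "ShimV121"); // extends ShimV120 without overriding anything
			VERSION_TO_SHIM.put("1.2.2", "ShimV122");
			VERSION_TO_SHIM.put("2.3.0", "ShimV230");
			// ... remaining 2.3.x versions registered the same way
		}

		static String shimFor(String hiveVersion) {
			String shim = VERSION_TO_SHIM.get(hiveVersion);
			if (shim == null) {
				throw new IllegalArgumentException("Unsupported Hive version: " + hiveVersion);
			}
			return shim;
		}
	}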

@asfgit asfgit closed this in 0437ad2 Aug 27, 2019