
[FLINK-39288][docs] Update DataStream API package guidance for Flink 1.20.x and 2.2.x #4339

Merged
lvyanquan merged 9 commits into apache:master from lvyanquan:FLINK-39288
Mar 26, 2026

Conversation

@lvyanquan
Contributor

@lvyanquan lvyanquan commented Mar 25, 2026

What is the purpose of the change

This PR updates the DataStream API package guidance documentation to support Flink 1.20.x and Flink 2.2.x with the latest CDC connector versions.

Brief change log

  • Update pom.xml example for Flink 1.20.x: Use CDC version 3.6.0-1.20 and Flink 1.20.3
  • Add new pom.xml example for Flink 2.2.x: Use CDC version 3.6.0-2.2 and Flink 2.2.0
  • Upgrade Java version: From 1.8 to 11 for both examples
  • Update CDC connector: Use flink-sql-connector-mysql-cdc instead of deprecated flink-connector-mysql-cdc
  • Add missing dependencies: flink-connector-base and log4j-slf4j-impl
  • Remove redundant dependency: Explicit flink-shaded-guava is no longer needed (included in CDC connector)
  • Sync Chinese documentation: Ensure docs/content.zh/ matches English version
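The change log above can be sketched as a dependency section for the Flink 1.20.x example. This is a minimal illustration assembled from the versions and artifacts named in this PR description, not the exact docs content; the log4j-slf4j-impl version and the scopes shown are assumptions.

```xml
<properties>
    <flink.version>1.20.3</flink.version>
    <flink.cdc.version>3.6.0-1.20</flink.cdc.version>
    <maven.compiler.source>11</maven.compiler.source>
    <maven.compiler.target>11</maven.compiler.target>
</properties>

<dependencies>
    <!-- Flink runtime APIs: provided scope, supplied by the cluster at runtime -->
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-streaming-java</artifactId>
        <version>${flink.version}</version>
        <scope>provided</scope>
    </dependency>
    <!-- Newly added in this PR: base connector classes -->
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-connector-base</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <!-- Shaded CDC connector, replacing the deprecated flink-connector-mysql-cdc -->
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-sql-connector-mysql-cdc</artifactId>
        <version>${flink.cdc.version}</version>
    </dependency>
    <!-- Newly added in this PR: logging binding (version is an assumption) -->
    <dependency>
        <groupId>org.apache.logging.log4j</groupId>
        <artifactId>log4j-slf4j-impl</artifactId>
        <version>2.17.1</version>
        <scope>runtime</scope>
    </dependency>
</dependencies>
```

Note that no explicit flink-shaded-guava entry appears here; per the change log it is already bundled in the CDC connector.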

Notes

Flink 1.20.x

Flink 1.20 does not depend on the compatibility layer code, so there is no need to introduce additional flink-compat dependencies.

Flink 2.2.x

For Flink 2.2.x, it is recommended to use flink-sql-connector-mysql-cdc instead of the deprecated flink-connector-mysql-cdc: it is a shaded JAR that bundles all necessary dependencies and provides better compatibility.
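For Flink 2.2.x the structure is the same with bumped versions. A minimal sketch, assuming the versions stated in this PR's change log (property names are illustrative):

```xml
<properties>
    <flink.version>2.2.0</flink.version>
    <flink.cdc.version>3.6.0-2.2</flink.cdc.version>
</properties>

<!-- The shaded connector recommended above for Flink 2.2.x -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-sql-connector-mysql-cdc</artifactId>
    <version>${flink.cdc.version}</version>
</dependency>
```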

Verifying this change

This change is a documentation update and does not affect any code logic.

Does this pull request potentially affect one of the following parts

  • Dependencies (does it add or upgrade a dependency): no
  • The public API, i.e., are there any changes to public classes or interfaces: no
  • The serializers: no
  • The runtime per-record code paths (performance sensitive): no
  • The compatibility between versions: no
  • The S3 filesystem connector: no

Documentation

  • Does this pull request introduce a new feature? no
  • If yes, how is the feature documented? not applicable

🤖 Generated with Claude Code

…1.20.x and 2.2.x

- Update pom.xml example for Flink 1.20.x with CDC version 3.6.0-1.20
- Add new pom.xml example for Flink 2.2.x with CDC version 3.6.0-2.2
- Upgrade Java version from 1.8 to 11 for both examples
- Use flink-sql-connector-mysql-cdc instead of flink-connector-mysql-cdc
- Add flink-connector-base and log4j-slf4j-impl dependencies
- Remove redundant flink-shaded-guava explicit dependency
- Sync Chinese documentation with English version

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@github-actions github-actions bot added the docs (Improvements or additions to documentation) label Mar 25, 2026
lvyanquan and others added 3 commits March 25, 2026 19:46
- Add missing <scope>provided</scope> for flink-streaming-java, flink-clients, flink-core
- Use flink-connector-mysql-cdc to match English version
- Translate title to Chinese

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
```xml
<include>io.debezium:debezium-connector-mysql</include>
<include>org.apache.flink:flink-connector-debezium</include>
<include>org.apache.flink:flink-connector-mysql-cdc</include>
<include>org.apache.flink:flink-sql-connector-mysql-cdc</include>
```
Contributor


Why change this here? I think keeping dependencies to a minimum is important for DataStream users.

Contributor Author

@lvyanquan lvyanquan Mar 25, 2026


This is because when running jobs on Flink 2.2, we depend on two flink-shaded-guava versions: 31.1-jre-17.0 (as we directly import org.apache.flink.shaded.guava31 in the code) and 33.4.0-jre-20.0. When running in an IDE (not submitting a JAR to the cluster), since they share the same groupId and artifactId, only the 33.4.0-jre-20.0 dependency can be resolved. Therefore, an uber JAR is needed to provide the other flink-shaded-guava dependency.

This dependency is only required for Flink 2.2. For Flink 1.20 jobs, the dependency configuration remains the same as before.
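The two-Guava situation described above is typically resolved with a maven-shade-plugin relocation (a later commit message in this PR mentions a guava31-to-guava33 relocation). A hedged sketch of what such a relocation could look like; the exact patterns used in the merged docs may differ:

```xml
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <relocations>
                    <!-- Map the guava31 package that the CDC code imports onto
                         the guava33 shading that Flink 2.2 actually provides -->
                    <relocation>
                        <pattern>org.apache.flink.shaded.guava31</pattern>
                        <shadedPattern>org.apache.flink.shaded.guava33</shadedPattern>
                    </relocation>
                </relocations>
            </configuration>
        </execution>
    </executions>
</plugin>
```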

Contributor

@leonardBang leonardBang left a comment


Thanks @lvyanquan for the contribution, I left some comments

lvyanquan and others added 2 commits March 25, 2026 21:00
- Add flink-shaded-guava version comments for Flink 1.13-1.18 and 2.2
- Add flink-connector-mysql-cdc and flink-cdc-flink2-compat dependencies for Flink 2.2.x
- Add guava31 to guava33 relocation for Flink 2.2.x
- Translate title to Chinese

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
lvyanquan and others added 3 commits March 25, 2026 21:31
This reverts commit 3bf2060.
…mments

- Add flink-shaded-guava 31.1-jre-17.0 for Flink 1.20.x
- Add flink-shaded-guava 33.4.0-jre-20.0 for Flink 2.2.x
- Add version reference comments for Flink 1.13-1.18 and 2.2
- Sync Chinese documentation with English version

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Contributor

@leonardBang leonardBang left a comment


Thanks @lvyanquan for the update, LGTM

@lvyanquan lvyanquan merged commit 619ab90 into apache:master Mar 26, 2026
8 checks passed
lvyanquan added a commit that referenced this pull request Mar 26, 2026
…1.20.x and 2.2.x (#4339)

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
(cherry picked from commit 619ab90)

Labels

approved · docs (Improvements or additions to documentation) · reviewed
