[FLINK-29961][doc] Make referencing custom image clearer for Docker #20935

Open: wants to merge 7 commits into `master`
@@ -405,16 +405,24 @@ services:
```
You can then start creating tables and querying them.
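
For instance, a quick smoke test from the SQL client could look like the following sketch, which uses the bundled `datagen` connector; the table and column names are made up for illustration:

```sql
-- A throwaway table backed by the built-in datagen connector
CREATE TABLE orders (
    order_id BIGINT,
    price    DOUBLE
) WITH (
    'connector' = 'datagen',
    'rows-per-second' = '5'
);

-- Query the generated rows
SELECT order_id, price FROM orders;
```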

* Note that all required dependencies (e.g. for connectors) need to be available in the cluster as well as in the client.
For example, if you would like to add and use the SQL Kafka connector, you need to build a custom image.
1. Create a Dockerfile named `Kafka.Dockerfile` as follows:

```Dockerfile
FROM flink:{{< stable >}}{{< version >}}-scala{{< scala_version >}}{{< /stable >}}{{< unstable >}}latest{{< /unstable >}}
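# Download the SQL Kafka connector JAR into /opt/flink/lib, which is on the classpath of both the Flink cluster and the SQL client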
RUN wget -P /opt/flink/lib https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-kafka/{{< version >}}/flink-sql-connector-kafka-{{< version >}}.jar
```

> **Member Author:** I found that the Kafka connector does not include the Scala version in its name, so I updated the URL as well.

2. Replace the `image` config with a `build` section that references the Dockerfile for the jobmanager, taskmanager, and sql-client services.
For example, the jobmanager service will then be configured as follows:
```yaml
jobmanager:
  build:
    dockerfile: ./Kafka.Dockerfile
  ...
```
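
For reference, a minimal sketch of the relevant part of the compose file after this change could look as follows. Only the `build` sections are shown, and an explicit `context` is added under the assumption that `Kafka.Dockerfile` sits next to the compose file; everything else (ports, environment, commands, ...) stays as in the example above.

```yaml
services:
  jobmanager:
    build:
      context: .
      dockerfile: ./Kafka.Dockerfile
    # ... rest of the jobmanager configuration as above ...
  taskmanager:
    build:
      context: .
      dockerfile: ./Kafka.Dockerfile
    # ... rest of the taskmanager configuration as above ...
  sql-client:
    build:
      context: .
      dockerfile: ./Kafka.Dockerfile
    # ... rest of the sql-client configuration as above ...
```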

SQL commands like `ADD JAR` will not work for JARs located on the host machine, because they only work with the local filesystem, which in this case is Docker's overlay filesystem.
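
In other words, the path passed to `ADD JAR` has to exist inside the container's filesystem, for example a JAR baked into the custom image; the JAR names and paths in this sketch are hypothetical:

```sql
-- Works only if this path exists inside the sql-client container,
-- e.g. a JAR that was baked into the custom image:
ADD JAR '/opt/flink/lib/my-udf.jar';

-- Fails: this path exists only on the host machine and is not
-- visible from inside the container.
-- ADD JAR '/home/alice/jars/my-udf.jar';
```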

## Using Flink Python on Docker