[FLINK-34123][docs][type] Add doc for built-in serialization support for map and lists

X-czh committed May 23, 2024
1 parent bd3fecc commit 63bf63f
Showing 1 changed file with 12 additions and 5 deletions.
@@ -37,15 +37,16 @@ Flink places some restrictions on the type of elements that can be in a DataStream
The reason for this is that the system analyzes the types to determine
efficient execution strategies.

-There are seven different categories of data types:
+There are eight different categories of data types:

1. **Java Tuples** and **Scala Case Classes**
2. **Java POJOs**
3. **Primitive Types**
-4. **Regular Classes**
-5. **Values**
-6. **Hadoop Writables**
-7. **Special Types**
+4. **Common Collection Types**
+5. **Regular Classes**
+6. **Values**
+7. **Hadoop Writables**
+8. **Special Types**

#### Tuples and Case Classes

@@ -167,6 +168,12 @@ input.keyBy(_.word)

Flink supports all Java and Scala primitive types such as `Integer`, `String`, and `Double`.

+#### Common Collection Types
+
+Flink supports common Java collection types (currently only `Map` and `List`). Note that Flink will not
+preserve the underlying implementation types. For example, Flink will deserialize any map data into a
+`HashMap` and any list data into an `ArrayList`. In order to avoid this, a custom serializer is required.
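
For illustration (not part of the diff above), a minimal sketch of how the built-in `Map`/`List` support might be used from the DataStream API. The class name, job name, and sample data are hypothetical; the `Types.LIST` / `Types.MAP` hints from `org.apache.flink.api.common.typeinfo.Types` are used here as one way to declare the element types explicitly so the built-in collection serializers are picked.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.LinkedList;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class CollectionTypesExample {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Declare the element type explicitly so Flink uses its built-in List serializer.
        List<List<String>> listElements = Arrays.asList(
                new LinkedList<>(Arrays.asList("a", "b")),
                new ArrayList<>(Arrays.asList("c")));
        DataStream<List<String>> lists =
                env.fromCollection(listElements, Types.LIST(Types.STRING));

        // Same idea for maps. As noted above, the implementation type is not preserved:
        // a TreeMap fed in here is restored as a HashMap after deserialization.
        Map<String, Integer> scores = new TreeMap<>();
        scores.put("a", 1);
        scores.put("b", 2);
        DataStream<Map<String, Integer>> maps =
                env.fromCollection(Arrays.asList(scores), Types.MAP(Types.STRING, Types.INT));

        // Downstream, the concrete classes observed are ArrayList / HashMap.
        lists.map(l -> l.getClass().getSimpleName()).returns(Types.STRING).print();
        maps.map(m -> m.getClass().getSimpleName()).returns(Types.STRING).print();

        env.execute("collection-types-sketch");
    }
}
```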

#### General Class Types

Flink supports most Java and Scala classes (API and custom).
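
As a further illustrative sketch (again not part of the committed change), putting an arbitrary user-defined class into a stream; the `SensorReading` class and all values are hypothetical. Flink analyzes such classes and chooses a serializer automatically; this particular class happens to follow the POJO rules, while classes that do not are still supported and handled as generic types.

```java
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class GeneralClassExample {

    // A hypothetical user-defined class; Flink picks a serializer for it automatically.
    public static class SensorReading {
        public String sensorId;
        public double value;

        public SensorReading() {}

        public SensorReading(String sensorId, double value) {
            this.sensorId = sensorId;
            this.value = value;
        }

        @Override
        public String toString() {
            return sensorId + "=" + value;
        }
    }

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        DataStream<SensorReading> readings = env.fromElements(
                new SensorReading("s1", 21.5),
                new SensorReading("s2", 19.0));

        readings.print();
        env.execute("general-class-sketch");
    }
}
```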