Error with Data size larger than 1M, will not write to zk. Data (first 1k) #7704

@dongxiaoman

Description

This is not critical, but it points to a problem worth investigating.

Our QA cluster is having trouble with some accumulated metadata: it has 10k+ real-time segments in place, even though the data volume itself is not large.

In the logs we see messages like the one below:

2021/11/04 20:31:34.166 ERROR [ZkClient] [HelixTaskExecutor-message_handle_thread] Data size larger than 1M, will not write to zk. Data (first 1k): {
  "id" : "point_entry_REALTIME",
  "simpleFields" : {
    "BATCH_MESSAGE_MODE" : "false",
    "BUCKET_SIZE" : "0",
    "SESSION_ID" : "30069443a0581e1",
    "STATE_MODEL_DEF" : "SegmentOnlineOfflineStateModel",
    "STATE_MODEL_FACTORY_NAME" : "DEFAULT"
  },
  "mapFields" : {
    "point_entry__0__0__20211030T0056Z" : {
      "CURRENT_STATE" : "OFFLINE"
    },
    "point_entry__0__100__20211102T0746Z" : {
      "CURRENT_STATE" : "OFFLINE"
    },
    "point_entry__0__101__20211102T0817Z" : {
      "CURRENT_STATE" : "OFFLINE"
    },
    "point_entry__0__102__20211102T0909Z" : {
      "CURRENT_STATE" : "OFFLINE"
    },
    "point_entry__0__103__20211102T0946Z" : {
      "CURRENT_STATE" : "ONLINE",
      "END_TIME" : "1636056441791",
      "INFO
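The error comes from Helix refusing to write a CURRENT_STATE record that exceeds ZooKeeper's default 1 MB znode limit (`jute.maxbuffer`). A rough back-of-envelope sketch shows how 10k+ segments can push the serialized record past that limit; the per-segment fields here (`START_TIME`, an empty `INFO`) are assumptions extrapolated from the truncated log excerpt above, not taken from the actual znode:

```python
import json

def current_state_size(num_segments: int) -> int:
    """Approximate the serialized size of a Helix CURRENT_STATE record
    with one mapFields entry per segment (field set is an assumption)."""
    map_fields = {
        # Hypothetical segment names following the pattern in the log above.
        f"point_entry__0__{i}__20211102T0746Z": {
            "CURRENT_STATE": "ONLINE",
            "END_TIME": "1636056441791",
            "START_TIME": "1636056441000",  # assumed field
            "INFO": "",                     # assumed field (truncated in log)
        }
        for i in range(num_segments)
    }
    record = {
        "id": "point_entry_REALTIME",
        "simpleFields": {
            "BATCH_MESSAGE_MODE": "false",
            "BUCKET_SIZE": "0",
            "SESSION_ID": "30069443a0581e1",
            "STATE_MODEL_DEF": "SegmentOnlineOfflineStateModel",
            "STATE_MODEL_FACTORY_NAME": "DEFAULT",
        },
        "mapFields": map_fields,
    }
    return len(json.dumps(record).encode("utf-8"))

ZK_LIMIT = 1_048_576  # ZooKeeper's default 1 MB znode cap
print(current_state_size(10_000) > ZK_LIMIT)
```

With roughly 140 bytes per segment entry, 10,000 segments already serialize to well over 1 MB, which is consistent with Helix logging this error once the segment count accumulates.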
