
[SUPPORT] spark task execute too long and can not finish when ObjectSizeCalculator.getObjectSize #11879

@KnightChess

Description


Describe the problem you faced

Similar to #10504, but in a different code path: the Spark task blocks inside `ObjectSizeCalculator.getObjectSize`, where JOL's `VM.current()` forks a HotSpot Serviceability Agent subprocess and waits on it indefinitely (`UNIXProcess.waitFor`), so the task runs far too long and never finishes:

java.lang.Object.wait(Native Method)
java.lang.Object.wait(Object.java:502)
java.lang.UNIXProcess.waitFor(UNIXProcess.java:396)
org.apache.hudi.org.openjdk.jol.vm.sa.ServiceabilityAgentSupport.callAgent(ServiceabilityAgentSupport.java:190)
org.apache.hudi.org.openjdk.jol.vm.sa.ServiceabilityAgentSupport.callAgent(ServiceabilityAgentSupport.java:163)
org.apache.hudi.org.openjdk.jol.vm.sa.ServiceabilityAgentSupport.getUniverseData(ServiceabilityAgentSupport.java:301)
org.apache.hudi.org.openjdk.jol.vm.VM.current(VM.java:77)
org.apache.hudi.org.openjdk.jol.info.GraphWalker.walk(GraphWalker.java:97)
org.apache.hudi.org.openjdk.jol.info.GraphLayout.parseInstance(GraphLayout.java:54)
org.apache.hudi.common.util.ObjectSizeCalculator.getObjectSize(ObjectSizeCalculator.java:57)
org.apache.hudi.common.util.HoodieRecordSizeEstimator.<init>(HoodieRecordSizeEstimator.java:40)
org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner.<init>(HoodieMergedLogRecordScanner.java:107)
org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner.<init>(HoodieMergedLogRecordScanner.java:74)
org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner$Builder.build(HoodieMergedLogRecordScanner.java:465)
org.apache.hudi.LogFileIterator$.$anonfun$scanLog$1(Iterators.scala:329)
org.apache.hudi.LogFileIterator$$$Lambda$1054/69444513.apply(Unknown Source)
org.apache.spark.sql.hive.HadoopUgiUtils$$anon$1.run(HadoopUgiUtils.scala:54)
java.security.AccessController.doPrivileged(Native Method)
javax.security.auth.Subject.doAs(Subject.java:422)
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1989)
org.apache.spark.sql.hive.HadoopUgiUtils$.doAsWithHiveSuperUser(HadoopUgiUtils.scala:53)
org.apache.hudi.LogFileIterator$.scanLog(Iterators.scala:261)
org.apache.hudi.LogFileIterator.<init>(Iterators.scala:93)
org.apache.hudi.RecordMergingFileIterator.<init>(Iterators.scala:173)
org.apache.hudi.HoodieMergeOnReadRDD.compute(HoodieMergeOnReadRDD.scala:100)
org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:369)
org.apache.spark.rdd.RDD.iterator(RDD.scala:333)
org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:369)
org.apache.spark.rdd.RDD.iterator(RDD.scala:333)
org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:369)
org.apache.spark.rdd.RDD.iterator(RDD.scala:333)
org.apache.spark.shuffle.ShuffleWriteProcessor.write(ShuffleWriteProcessor.scala:59)
org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:52)
org.apache.spark.scheduler.Task.run(Task.scala:131)
org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:506)
org.apache.spark.executor.Executor$TaskRunner$$Lambda$476/220247377.apply(Unknown Source)
org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1463)
org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:509)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
java.lang.Thread.run(Thread.java:745)
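For context, the trace shows the executor thread stuck in `Process.waitFor()` underneath JOL's `ServiceabilityAgentSupport.callAgent`. One generic mitigation pattern (a sketch only, not Hudi's actual code — the executor setup, timeout value, and fallback size below are illustrative assumptions) is to run the potentially blocking size estimation under a bounded timeout and fall back to a default estimate if it does not return:

```java
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;
import java.util.function.Supplier;

public class BoundedSizeEstimator {
    // Hypothetical fallback used when estimation does not return in time.
    private static final long DEFAULT_SIZE_BYTES = 1024L;

    private final ExecutorService executor =
        Executors.newSingleThreadExecutor(r -> {
            Thread t = new Thread(r, "size-estimator");
            t.setDaemon(true); // do not keep the JVM alive if the call hangs
            return t;
        });

    /** Runs the (possibly blocking) estimator, falling back after the timeout. */
    public long estimateOrDefault(Supplier<Long> estimator, long timeoutMs) {
        Future<Long> f = executor.submit(estimator::get);
        try {
            return f.get(timeoutMs, TimeUnit.MILLISECONDS);
        } catch (TimeoutException e) {
            f.cancel(true); // interrupt the stuck estimation
            return DEFAULT_SIZE_BYTES;
        } catch (InterruptedException | ExecutionException e) {
            return DEFAULT_SIZE_BYTES;
        }
    }

    public static void main(String[] args) {
        BoundedSizeEstimator est = new BoundedSizeEstimator();
        // Fast path: the estimator returns promptly, its value is used.
        System.out.println(est.estimateOrDefault(() -> 96L, 1000));
        // Hung path: simulate an estimator stuck (as in Process.waitFor).
        System.out.println(est.estimateOrDefault(() -> {
            try { Thread.sleep(60_000); } catch (InterruptedException ie) { }
            return 96L;
        }, 200));
    }
}
```

Separately, recent JOL versions expose a system property to skip the Serviceability Agent attach entirely (if memory serves, `-Djol.skipHotspotSAAttach=true`, though that name should be verified against the jol-core version shaded into this Hudi release); skipping the attach avoids the subprocess fork that this trace is blocked on.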

To Reproduce

Cannot reproduce reliably.

Expected behavior

The task finishes normally; `ObjectSizeCalculator.getObjectSize` should not block indefinitely.

Environment Description

  • Hudi version : 0.13.1

  • Spark version : 3.2

  • Hive version :

  • Hadoop version :

  • Storage (HDFS/S3/GCS..) :

  • Running on Docker? (yes/no) :

Metadata

Assignees

No one assigned

    Labels

    area:writer (Write client and core write operations)
    priority:critical (Production degraded; pipelines stalled)

    Type

    No type

    Projects

    Status

    ⏳ Awaiting Triage

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests