[SPARK-27852][Spark Core] updateBytesWritten() operation is missed #24720
lulumomo wants to merge 1 commit into apache:master
Conversation
…ed in DiskBlockObjectWriter.scala
In DiskBlockObjectWriter.scala there are two overloaded write functions; the first executes the updateBytesWritten function while the other doesn't. I think writeMetrics should record all information about write operations, since some of it is displayed in the Spark jobs UI, such as the data sizes of shuffle read and shuffle write.
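For context, the key/value overload does route through the metrics path. A simplified sketch, reconstructed from memory of DiskBlockObjectWriter (exact names and details may vary by Spark version):

```scala
// Sketch of the key/value write overload (simplified; from memory,
// may differ from the exact Spark version under discussion).
def write(key: Any, value: Any): Unit = {
  if (!streamOpen) {
    open()
  }
  objOut.writeKey(key)
  objOut.writeValue(value)
  recordWritten()  // bumps the record count and periodically calls updateBytesWritten()
}
```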
Can one of the admins verify this patch?
This would make it update metrics on every write. It appears this is purposely done only every 16,000 records for this reason.
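The batching this comment refers to lives in recordWritten(), which flushes byte counts to writeMetrics only periodically. A sketch from memory (the exact constant and names may vary by Spark version):

```scala
// Sketch of DiskBlockObjectWriter.recordWritten() (from memory; details may
// differ by Spark version). Byte counts are flushed only every N records to
// keep the relatively expensive metric update off the per-record hot path.
def recordWritten(): Unit = {
  numRecordsWritten += 1
  writeMetrics.incRecordsWritten(1)
  if (numRecordsWritten % 16384 == 0) {
    updateBytesWritten()
  }
}
```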
In this write function there isn't any code that updates metrics or records, and bs.write(kvBytes, offs, len) doesn't do that work either. It seems strange.
Look at …
Here is the function body:
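The body in question, reconstructed as a sketch from memory of DiskBlockObjectWriter (may differ from the exact version being discussed):

```scala
// Sketch of the raw-bytes write overload. Note there is no call into
// writeMetrics or recordWritten() anywhere in this path.
override def write(kvBytes: Array[Byte], offs: Int, len: Int): Unit = {
  if (!streamOpen) {
    open()
  }
  bs.write(kvBytes, offs, len)
}
```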
So, are the metrics not getting updated? What's the issue this PR fixes?
@BestOreo the caller calls …
What changes were proposed in this pull request?
One line of code may be missing in core/src/main/scala/org/apache/spark/storage/DiskBlockObjectWriter.scala.
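A sketch of what such a one-line change could look like (hypothetical reconstruction, not the PR's actual diff; note the earlier review comment that unconditionally updating metrics on every write defeats the deliberate batching):

```scala
// Hypothetical sketch of the proposed change to the raw-bytes overload.
override def write(kvBytes: Array[Byte], offs: Int, len: Int): Unit = {
  if (!streamOpen) {
    open()
  }
  bs.write(kvBytes, offs, len)
  updateBytesWritten()  // proposed addition: keep writeMetrics in sync on this path
}
```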
Possible Patch Link