Conversation

@zifeif2 zifeif2 commented Oct 21, 2025

What changes were proposed in this pull request?

We found a performance regression in recentProgress in classic PySpark on assigned clusters after version 4.0. The direct cause is the commit that changed how recentProgress is loaded, from fromJson to fromJObject; that commit is only included in 4.0. When constructing a dict from a JObject, py4j needs to make multiple RPCs to the JVM, which results in a long load time.

Proposed Fix:

Use fromJson to load StreamingQueryProgress instead. This also aligns with how recentProgress is loaded in the Spark Connect Python client.

We are also fixing lastProgress to load StreamingQueryProgress from JSON.
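As a rough illustration of why the fix helps (this is not the actual PySpark internals, and the payload below is made up): parsing a JSON string sends one string across the py4j boundary and builds the dict entirely on the Python side, whereas walking a Java object through py4j costs a round trip per accessed field. A minimal sketch of the fromJson-style approach:

```python
import json

# Hypothetical JSON payload for one progress update, as the JVM might return it.
progress_json = '{"id": "abc", "batchId": 7, "numInputRows": 10}'

# fromJson-style loading: a single string crosses the Python/JVM boundary,
# then json.loads builds the whole dict locally with no further RPCs.
progress = json.loads(progress_json)
print(progress["numInputRows"])  # -> 10
```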

Why are the changes needed?

Otherwise, there is a performance regression in recentProgress. Here is a log of the time it takes to load 70 recent progress entries:

[2025-10-22 06:28:37]Total Time: 0.12269 seconds for getting 1 progress
[2025-10-22 06:28:38]Total Time: 0.199387 seconds for getting 2 progress
[2025-10-22 06:28:39]Total Time: 0.384784 seconds for getting 4 progress
...
[2025-10-22 06:29:27]Total Time: 4.089001 seconds for getting 48 progress
[2025-10-22 06:29:32]Total Time: 4.571433 seconds for getting 53 progress
[2025-10-22 06:29:38]Total Time: 5.024825 seconds for getting 58 progress
[2025-10-22 06:29:45]Total Time: 5.520222 seconds for getting 64 progress
[2025-10-22 06:29:52]Total Time: 6.071674 seconds for getting 71 progress
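The growth above is consistent with a per-field RPC cost: if each progress dict requires roughly k round trips and there are n progress objects, one recentProgress call costs on the order of n * k RPCs. A toy model of that scaling, with made-up latency and field-count numbers (purely illustrative, not measured values):

```python
RPC_LATENCY_S = 0.0005      # assumed ~0.5 ms per py4j round trip (illustrative)
FIELDS_PER_PROGRESS = 30    # rough guess at the field count of one progress dict

def recent_progress_cost(num_progress):
    # One RPC per field of each progress object in the list.
    return num_progress * FIELDS_PER_PROGRESS * RPC_LATENCY_S

# Cost grows linearly with the number of accumulated progress objects.
for n in (1, 48, 71):
    print(f"{n} progress -> {recent_progress_cost(n):.3f} s")
```

With fromJson, the dominant per-object cost is a single string transfer plus local parsing, so the per-call time stays small even as the progress list grows.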

This was generated by adding the following test to test_streaming.py locally:

import time
from datetime import datetime

def test_recent_progress_regression(self):
    df = self.spark.readStream.format("rate").option("rowsPerSecond", 10).load()
    q = df.writeStream.format("noop").start()
    print("begin waiting for progress")
    numProgress = len(q.recentProgress)
    while numProgress < 70 and q.exception() is None:
        time.sleep(1)
        beforeTime = datetime.now()
        rep = q.recentProgress
        numProgress = len(rep)
        afterTime = datetime.now()
        print(
            f"[{afterTime.strftime('%Y-%m-%d %H:%M:%S')}]"
            f"Total Time: {(afterTime - beforeTime).total_seconds()} seconds "
            f"for getting {numProgress} progress"
        )
    q.stop()
    q.awaitTermination()
    assert q.exception() is None
    assert numProgress == 300  # intentionally wrong so the test fails and we can see the logs

Does this PR introduce any user-facing change?

No

How was this patch tested?

Ran the local test with the changes; here is the output log:

[2025-10-22 06:23:35]Total Time: 0.001411 seconds for getting 1 progress
[2025-10-22 06:23:36]Total Time: 0.002293 seconds for getting 2 progress
[2025-10-22 06:23:37]Total Time: 0.003163 seconds for getting 3 progress
[2025-10-22 06:23:38]Total Time: 0.003583 seconds for getting 4 progress
[2025-10-22 06:23:39]Total Time: 0.004025 seconds for getting 5 progress
[2025-10-22 06:23:40]Total Time: 0.004954 seconds for getting 6 progress
...
[2025-10-22 06:24:43]Total Time: 0.012729 seconds for getting 69 progress
[2025-10-22 06:24:44]Total Time: 0.01289 seconds for getting 70 progress

Was this patch authored or co-authored using generative AI tooling?

No

@zifeif2 zifeif2 marked this pull request as ready for review October 22, 2025 15:18
@anishshri-db anishshri-db left a comment


nice finding. Thanks !


@micheal-o micheal-o left a comment


Thanks for root-causing and fixing this. Nice work!

anishshri-db pushed a commit that referenced this pull request Oct 23, 2025
…assic pyspark

### What changes were proposed in this pull request?
See #52688 for details. This PR is to backport the changes to Spark 4.0

### Why are the changes needed?

See #52688

### Does this PR introduce _any_ user-facing change?

No

### How was this patch tested?

See #52688

### Was this patch authored or co-authored using generative AI tooling?

No

Closes #52698 from zifeif2/recent-progress-bug-4.0.

Authored-by: Ubuntu <zifei.feng@your.hostname.com>
Signed-off-by: Anish Shrigondekar <anish.shrigondekar@databricks.com>
