[SPARK-46825][DOCS] Build Spark only once when building docs #44865
What changes were proposed in this pull request?
As suggested here, this change improves the documentation build so that Spark is built at most once, regardless of which API docs are requested in the build.
Why are the changes needed?
There is no need to build Spark multiple times when generating docs. In particular, building Scala and Python docs, or Scala and SQL docs, causes Spark to be built twice.
Fixing this problem saves us a couple of minutes.
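The idea of building at most once can be sketched with a simple guard: the first doc target that needs Spark triggers a build, and later targets see that it is already built and skip it. This is a minimal hypothetical illustration, not the actual Spark build scripts; the function name and the echoed message are stand-ins for the real sbt/Maven invocation.

```shell
#!/bin/sh
# Hypothetical sketch: build Spark at most once, no matter how many
# doc targets request it during a single docs build.
SPARK_BUILT=0

build_spark_once() {
  if [ "$SPARK_BUILT" -eq 0 ]; then
    echo "building Spark"   # stand-in for the real sbt/Maven build step
    SPARK_BUILT=1
  fi
}

# Each API doc target asks for a Spark build; only the first triggers one.
build_spark_once   # e.g. before generating Scala docs
build_spark_once   # e.g. before generating Python docs
build_spark_once   # e.g. before generating SQL docs
```

With a guard like this, adding more doc targets to the build never adds another full Spark compilation, which is where the time savings described above come from.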
Does this PR introduce any user-facing change?
No.
How was this patch tested?
I built the docs on master as well as on this branch and compared the build times before and after this change. The change saves about 2.5 minutes.
Additionally, I diffed the generated _site/ dir across master and this branch and confirmed they are essentially identical, except for some generated SQL example files.
Was this patch authored or co-authored using generative AI tooling?
No.